Oct 10 06:24:11 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 10 06:24:11 crc restorecon[4729]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:24:11 crc restorecon[4729]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 10 06:24:11 crc restorecon[4729]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc 
restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc 
restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 
06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:11 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 
crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 
06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 
crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:24:12 crc restorecon[4729]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc 
restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:24:12 crc restorecon[4729]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 10 06:24:13 crc kubenswrapper[4822]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.383761 4822 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.391954 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392005 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392016 4822 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392025 4822 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392033 4822 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392055 4822 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392064 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392072 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392081 4822 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392090 4822 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392098 4822 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392107 4822 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392115 4822 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392123 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392130 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392138 4822 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392145 4822 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392153 4822 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392161 4822 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392172 4822 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392182 4822 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392191 4822 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392200 4822 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392208 4822 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392216 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392225 4822 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392234 4822 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392244 4822 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392252 4822 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392260 4822 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392268 4822 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392289 4822 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392298 4822 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392306 4822 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392314 4822 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392322 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392329 4822 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392337 4822 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392345 4822 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392353 4822 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392360 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392368 4822 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392376 4822 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392384 4822 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392391 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392402 4822 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392412 4822 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392421 4822 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392431 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392440 4822 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392449 4822 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392457 4822 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392465 4822 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392473 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392480 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392488 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392495 4822 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392503 4822 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392511 4822 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392524 4822 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392533 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392542 4822 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392550 4822 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392558 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392565 4822 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392573 4822 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392580 4822 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392595 4822 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392605 4822 feature_gate.go:330] unrecognized feature gate: Example Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392614 4822 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.392622 4822 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.393954 4822 flags.go:64] FLAG: --address="0.0.0.0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.393980 4822 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394004 4822 flags.go:64] FLAG: --anonymous-auth="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394017 4822 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394029 4822 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394038 4822 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394050 4822 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394061 4822 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394070 4822 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394079 4822 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394089 4822 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394099 4822 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.394109 4822 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394118 4822 flags.go:64] FLAG: --cgroup-root="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394127 4822 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394136 4822 flags.go:64] FLAG: --client-ca-file="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394145 4822 flags.go:64] FLAG: --cloud-config="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394153 4822 flags.go:64] FLAG: --cloud-provider="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394162 4822 flags.go:64] FLAG: --cluster-dns="[]" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394176 4822 flags.go:64] FLAG: --cluster-domain="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394184 4822 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394194 4822 flags.go:64] FLAG: --config-dir="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394203 4822 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394212 4822 flags.go:64] FLAG: --container-log-max-files="5" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394224 4822 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394233 4822 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394243 4822 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394253 4822 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394263 4822 flags.go:64] FLAG: --contention-profiling="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394273 
4822 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394285 4822 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394294 4822 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394304 4822 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394315 4822 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394324 4822 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394333 4822 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394342 4822 flags.go:64] FLAG: --enable-load-reader="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394351 4822 flags.go:64] FLAG: --enable-server="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394363 4822 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394374 4822 flags.go:64] FLAG: --event-burst="100" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394383 4822 flags.go:64] FLAG: --event-qps="50" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394393 4822 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394403 4822 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394412 4822 flags.go:64] FLAG: --eviction-hard="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394422 4822 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394431 4822 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394441 4822 
flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394450 4822 flags.go:64] FLAG: --eviction-soft="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394459 4822 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394467 4822 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394477 4822 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394486 4822 flags.go:64] FLAG: --experimental-mounter-path="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394495 4822 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394504 4822 flags.go:64] FLAG: --fail-swap-on="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394514 4822 flags.go:64] FLAG: --feature-gates="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394524 4822 flags.go:64] FLAG: --file-check-frequency="20s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394534 4822 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394543 4822 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394552 4822 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394562 4822 flags.go:64] FLAG: --healthz-port="10248" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394571 4822 flags.go:64] FLAG: --help="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394581 4822 flags.go:64] FLAG: --hostname-override="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394590 4822 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394599 4822 flags.go:64] 
FLAG: --http-check-frequency="20s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394607 4822 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394616 4822 flags.go:64] FLAG: --image-credential-provider-config="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394625 4822 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394634 4822 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394643 4822 flags.go:64] FLAG: --image-service-endpoint="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394652 4822 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394661 4822 flags.go:64] FLAG: --kube-api-burst="100" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394671 4822 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394680 4822 flags.go:64] FLAG: --kube-api-qps="50" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394689 4822 flags.go:64] FLAG: --kube-reserved="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394698 4822 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394706 4822 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394715 4822 flags.go:64] FLAG: --kubelet-cgroups="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394724 4822 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394733 4822 flags.go:64] FLAG: --lock-file="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394742 4822 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394751 4822 
flags.go:64] FLAG: --log-flush-frequency="5s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394760 4822 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394774 4822 flags.go:64] FLAG: --log-json-split-stream="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394782 4822 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394791 4822 flags.go:64] FLAG: --log-text-split-stream="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394824 4822 flags.go:64] FLAG: --logging-format="text" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394834 4822 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394844 4822 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394853 4822 flags.go:64] FLAG: --manifest-url="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394862 4822 flags.go:64] FLAG: --manifest-url-header="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394873 4822 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394883 4822 flags.go:64] FLAG: --max-open-files="1000000" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394894 4822 flags.go:64] FLAG: --max-pods="110" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394903 4822 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394912 4822 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394921 4822 flags.go:64] FLAG: --memory-manager-policy="None" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394930 4822 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.394940 4822 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394952 4822 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394963 4822 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394990 4822 flags.go:64] FLAG: --node-status-max-images="50" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.394999 4822 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395008 4822 flags.go:64] FLAG: --oom-score-adj="-999" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395018 4822 flags.go:64] FLAG: --pod-cidr="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395027 4822 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395044 4822 flags.go:64] FLAG: --pod-manifest-path="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395056 4822 flags.go:64] FLAG: --pod-max-pids="-1" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395067 4822 flags.go:64] FLAG: --pods-per-core="0" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395076 4822 flags.go:64] FLAG: --port="10250" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395085 4822 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395094 4822 flags.go:64] FLAG: --provider-id="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395103 4822 flags.go:64] FLAG: --qos-reserved="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395113 4822 flags.go:64] FLAG: --read-only-port="10255" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.395122 4822 flags.go:64] FLAG: --register-node="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395131 4822 flags.go:64] FLAG: --register-schedulable="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395140 4822 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395155 4822 flags.go:64] FLAG: --registry-burst="10" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395165 4822 flags.go:64] FLAG: --registry-qps="5" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395174 4822 flags.go:64] FLAG: --reserved-cpus="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395182 4822 flags.go:64] FLAG: --reserved-memory="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395193 4822 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395202 4822 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395212 4822 flags.go:64] FLAG: --rotate-certificates="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395221 4822 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395230 4822 flags.go:64] FLAG: --runonce="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395241 4822 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395252 4822 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395261 4822 flags.go:64] FLAG: --seccomp-default="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395270 4822 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395279 4822 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: 
I1010 06:24:13.395288 4822 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395297 4822 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395307 4822 flags.go:64] FLAG: --storage-driver-password="root" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395317 4822 flags.go:64] FLAG: --storage-driver-secure="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395326 4822 flags.go:64] FLAG: --storage-driver-table="stats" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395335 4822 flags.go:64] FLAG: --storage-driver-user="root" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395343 4822 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395352 4822 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395361 4822 flags.go:64] FLAG: --system-cgroups="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395370 4822 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395384 4822 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395393 4822 flags.go:64] FLAG: --tls-cert-file="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395402 4822 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395413 4822 flags.go:64] FLAG: --tls-min-version="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395422 4822 flags.go:64] FLAG: --tls-private-key-file="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395431 4822 flags.go:64] FLAG: --topology-manager-policy="none" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395440 4822 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 10 06:24:13 crc 
kubenswrapper[4822]: I1010 06:24:13.395448 4822 flags.go:64] FLAG: --topology-manager-scope="container" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395457 4822 flags.go:64] FLAG: --v="2" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395470 4822 flags.go:64] FLAG: --version="false" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395481 4822 flags.go:64] FLAG: --vmodule="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395492 4822 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.395502 4822 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395710 4822 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395722 4822 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395731 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395740 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395748 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395757 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395765 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395773 4822 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395782 4822 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395789 4822 
feature_gate.go:330] unrecognized feature gate: Example Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395797 4822 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395829 4822 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395837 4822 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395845 4822 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395853 4822 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395861 4822 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395869 4822 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395877 4822 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395885 4822 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395892 4822 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395900 4822 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395908 4822 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395916 4822 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395931 4822 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 10 06:24:13 crc kubenswrapper[4822]: 
W1010 06:24:13.395939 4822 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395946 4822 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395954 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395962 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395970 4822 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395979 4822 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395987 4822 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.395995 4822 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396003 4822 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396011 4822 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396018 4822 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396026 4822 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396034 4822 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396042 4822 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396050 4822 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396058 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396066 4822 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396077 4822 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396086 4822 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396096 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396104 4822 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396113 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396123 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396131 4822 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396140 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396147 4822 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396155 4822 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396162 4822 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396170 4822 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode 
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396179 4822 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396187 4822 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396195 4822 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396202 4822 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396213 4822 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396224 4822 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396235 4822 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396245 4822 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396254 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396264 4822 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396275 4822 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396285 4822 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396293 4822 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396302 4822 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396311 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396318 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396328 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.396336 4822 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.396360 4822 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.409241 4822 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.409291 4822 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409361 4822 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409380 4822 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409385 4822 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409389 4822 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409393 4822 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409397 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409401 4822 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409405 4822 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409409 4822 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409413 4822 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409417 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409421 4822 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409426 4822 feature_gate.go:330] unrecognized feature gate: Example
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409430 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409434 4822 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409438 4822 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409442 4822 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409447 4822 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409451 4822 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409455 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409460 4822 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409467 4822 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409472 4822 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409477 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409481 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409485 4822 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409490 4822 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409497 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409501 4822 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409506 4822 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409511 4822 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409515 4822 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409519 4822 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409532 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409539 4822 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409544 4822 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409548 4822 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409553 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409557 4822 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409561 4822 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409566 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409570 4822 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409577 4822 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409586 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409593 4822 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409598 4822 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409603 4822 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409609 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409613 4822 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409617 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409622 4822 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409627 4822 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409631 4822 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409636 4822 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409641 4822 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409645 4822 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409651 4822 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409655 4822 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409660 4822 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409664 4822 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409669 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409673 4822 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409677 4822 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409683 4822 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409688 4822 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409692 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409698 4822 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409704 4822 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409709 4822 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409713 4822 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409718 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.409725 4822 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409861 4822 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409870 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409875 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409879 4822 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409883 4822 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409887 4822 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409891 4822 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409895 4822 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409898 4822 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409902 4822 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409905 4822 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409909 4822 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409913 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409917 4822 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409922 4822 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409926 4822 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409931 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409937 4822 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409942 4822 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409947 4822 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409951 4822 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409956 4822 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409960 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409964 4822 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409968 4822 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409972 4822 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409975 4822 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409979 4822 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409982 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409986 4822 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409989 4822 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409992 4822 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.409996 4822 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410000 4822 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410007 4822 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410011 4822 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410014 4822 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410018 4822 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410021 4822 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410025 4822 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410028 4822 feature_gate.go:330] unrecognized feature gate: Example
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410032 4822 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410035 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410039 4822 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410043 4822 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410046 4822 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410050 4822 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410054 4822 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410058 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410061 4822 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410065 4822 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410068 4822 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410073 4822 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410077 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410081 4822 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410085 4822 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410099 4822 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410102 4822 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410106 4822 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410110 4822 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410114 4822 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410119 4822 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410123 4822 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410127 4822 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410130 4822 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410134 4822 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410138 4822 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410141 4822 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410144 4822 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410148 4822 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.410152 4822 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.410158 4822 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.413661 4822 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.422858 4822 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.423010 4822 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.425301 4822 server.go:997] "Starting client certificate rotation"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.425340 4822 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.425661 4822 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 06:45:18.041175716 +0000 UTC
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.425831 4822 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2088h21m4.615349346s for next certificate rotation
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.466013 4822 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.470148 4822 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.488967 4822 log.go:25] "Validated CRI v1 runtime API"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.525931 4822 log.go:25] "Validated CRI v1 image API"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.527740 4822 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.535406 4822 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-10-06-18-10-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.535445 4822 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.551434 4822 manager.go:217] Machine: {Timestamp:2025-10-10 06:24:13.547918437 +0000 UTC m=+0.643076653 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b9d6aaf9-9893-464a-9c1f-35cedc127eea BootID:24930614-984a-4687-af00-12fa4519901f Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1b:8c:7b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1b:8c:7b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f4:cb:e3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9d:f5:5f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a8:e6:e1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:20:28:34 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a9:6a:2a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f2:62:ae:25:53:b6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:01:3f:3c:7c:f8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.551710 4822 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.551921 4822 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.553755 4822 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.554052 4822 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.554092 4822 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.554328 4822 topology_manager.go:138] "Creating topology manager with none policy"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.554339 4822 container_manager_linux.go:303] "Creating device plugin manager"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.555080 4822 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.555115 4822 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.556848 4822 state_mem.go:36] "Initialized new in-memory state store"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.556965 4822 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.561902 4822 kubelet.go:418] "Attempting to sync node with API server"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.561952 4822 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.561990 4822 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.562031 4822 kubelet.go:324] "Adding apiserver pod source"
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.562069 4822 apiserver.go:42]
"Waiting for node sync before watching apiserver pods" Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.573104 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.573402 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.573534 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.573850 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.573965 4822 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.576425 4822 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.578356 4822 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580250 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580288 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580296 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580306 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580319 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580325 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580333 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580347 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580363 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580373 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580387 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.580395 4822 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.582821 4822 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.583432 4822 server.go:1280] "Started kubelet" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.583468 4822 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.584665 4822 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.585129 4822 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.585581 4822 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 10 06:24:13 crc systemd[1]: Started Kubernetes Kubelet. Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.586915 4822 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.586963 4822 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.587007 4822 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:41:54.883951621 +0000 UTC Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.587070 4822 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1473h17m41.296885945s for next certificate rotation Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.587160 4822 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.587167 4822 volume_manager.go:287] "The desired_state_of_world populator 
starts" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.587210 4822 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.587298 4822 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.588108 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.588188 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.588327 4822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="200ms" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.588904 4822 factory.go:55] Registering systemd factory Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.588931 4822 factory.go:221] Registration of the systemd container factory successfully Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.589616 4822 server.go:460] "Adding debug handlers to kubelet server" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.590107 4822 factory.go:153] Registering CRI-O factory Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.590226 4822 factory.go:221] Registration of the crio container factory successfully Oct 10 06:24:13 crc kubenswrapper[4822]: 
I1010 06:24:13.590432 4822 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.590553 4822 factory.go:103] Registering Raw factory Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.590701 4822 manager.go:1196] Started watching for new ooms in manager Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.591404 4822 manager.go:319] Starting recovery of all containers Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.592885 4822 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186d0d1e66031952 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-10 06:24:13.583391058 +0000 UTC m=+0.678549264,LastTimestamp:2025-10-10 06:24:13.583391058 +0000 UTC m=+0.678549264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606727 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606786 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606814 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606825 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606836 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606846 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.606856 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607154 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607172 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607184 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607195 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607207 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607218 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607234 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607247 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607260 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607273 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607288 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607329 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607342 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607353 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607364 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607394 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607408 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607435 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607448 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607461 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607473 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607484 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607496 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607505 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607517 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607528 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607539 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607566 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607578 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607588 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607599 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607613 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607623 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607637 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607650 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607660 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607671 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607699 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607712 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607723 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607746 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607758 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607769 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607780 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607793 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607842 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607857 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607869 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607882 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607897 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607909 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607920 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607953 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607965 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607980 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.607994 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608008 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608020 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608035 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608048 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608064 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.608078 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608091 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608106 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608120 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608137 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608151 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608167 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608184 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608196 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608210 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608222 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608233 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608245 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608256 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608267 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608278 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608294 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608311 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608324 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608335 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608350 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608361 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608372 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608383 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608394 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608404 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608416 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608426 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608435 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608446 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608456 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608467 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608479 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608491 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608503 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608513 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608534 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" 
seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608545 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608557 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608569 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608580 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608594 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608605 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 10 06:24:13 crc 
kubenswrapper[4822]: I1010 06:24:13.608618 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608628 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608641 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608654 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608666 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608677 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608691 4822 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608703 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608714 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608729 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608741 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608753 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608764 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608775 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608785 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608809 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.608821 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609123 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609136 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609146 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609155 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609165 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609176 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609190 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609204 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609221 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609234 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609247 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609259 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609272 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609294 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" 
seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609318 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609332 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609344 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609360 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609382 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609398 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609419 4822 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609439 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609454 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609466 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609479 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609488 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609497 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609507 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609518 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609528 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609538 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609548 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609559 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609570 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.609582 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.611135 4822 manager.go:324] Recovery completed Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613121 4822 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613192 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613212 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.613225 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613241 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613259 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613273 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613287 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613302 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613314 4822 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613327 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613340 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613353 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613364 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613379 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613391 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613406 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613419 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613445 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613465 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613482 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613501 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613519 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613534 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613550 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613568 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613583 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613599 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613616 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613631 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613646 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613660 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613676 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613689 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613702 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613714 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613729 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613749 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613781 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613795 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613826 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613839 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613851 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613863 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613878 4822 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613890 4822 reconstruct.go:97] "Volume reconstruction finished" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.613900 4822 reconciler.go:26] "Reconciler: start to sync state" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 
06:24:13.624062 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.626054 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.626091 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.626103 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.627007 4822 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.627038 4822 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.627123 4822 state_mem.go:36] "Initialized new in-memory state store" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.642329 4822 policy_none.go:49] "None policy: Start" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.643618 4822 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.643643 4822 state_mem.go:35] "Initializing new in-memory state store" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.647033 4822 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.648966 4822 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.649013 4822 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.649050 4822 kubelet.go:2335] "Starting kubelet main sync loop" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.649208 4822 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 10 06:24:13 crc kubenswrapper[4822]: W1010 06:24:13.652040 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.652159 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.687496 4822 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.708168 4822 manager.go:334] "Starting Device Plugin manager" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.708231 4822 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.708246 4822 server.go:79] "Starting device plugin registration server" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.709118 4822 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 10 06:24:13 crc 
kubenswrapper[4822]: I1010 06:24:13.709139 4822 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.709395 4822 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.709466 4822 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.709479 4822 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.719230 4822 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.750396 4822 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.750590 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.752475 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.752543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.752559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.752913 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.753602 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.753683 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754400 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754457 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754602 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754735 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.754774 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755482 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755495 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755697 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755729 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.755942 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.756192 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.756281 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.756540 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.756580 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.756592 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757039 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757068 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757080 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757303 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757407 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757437 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.757744 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.758155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.758175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.758182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.759953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.760027 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.760050 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.760515 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.760578 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.763762 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.763837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.763851 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.789932 4822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="400ms" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.809275 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.810789 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.810890 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.810905 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.810945 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:13 crc kubenswrapper[4822]: E1010 06:24:13.811655 4822 kubelet_node_status.go:99] "Unable to register 
node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.816962 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.816997 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817021 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817038 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817059 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817075 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817094 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817187 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817356 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 
06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817409 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817472 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817521 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.817601 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919345 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919892 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919928 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919951 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919994 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.919671 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920048 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920084 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920014 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920128 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 
06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920179 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920135 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920193 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920146 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920153 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920332 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920351 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920368 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920386 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920402 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920418 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920435 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920749 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920839 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920828 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920796 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920866 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920797 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:13 crc kubenswrapper[4822]: I1010 06:24:13.920875 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.012724 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.015044 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.015108 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.015125 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.015164 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.015946 4822 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 10 06:24:14 crc kubenswrapper[4822]: 
I1010 06:24:14.085604 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.092425 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.111469 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.122930 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.128558 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.134764 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0a750b73242a9a8093898518523bc5eba5f2d3147bcfb40b86fa74d599d99330 WatchSource:0}: Error finding container 0a750b73242a9a8093898518523bc5eba5f2d3147bcfb40b86fa74d599d99330: Status 404 returned error can't find the container with id 0a750b73242a9a8093898518523bc5eba5f2d3147bcfb40b86fa74d599d99330 Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.135964 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6989f17ef2e522743098d76d396de48c0c64ee7b93e70317fe22b11789f8d6f0 WatchSource:0}: Error finding container 6989f17ef2e522743098d76d396de48c0c64ee7b93e70317fe22b11789f8d6f0: Status 404 returned error can't find the container with id 6989f17ef2e522743098d76d396de48c0c64ee7b93e70317fe22b11789f8d6f0 
Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.146205 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-462dfcf6830b3d607e2bef9d6b8ae38ee72e65a5946109902792e1787e1f31ac WatchSource:0}: Error finding container 462dfcf6830b3d607e2bef9d6b8ae38ee72e65a5946109902792e1787e1f31ac: Status 404 returned error can't find the container with id 462dfcf6830b3d607e2bef9d6b8ae38ee72e65a5946109902792e1787e1f31ac Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.148283 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-be33d4775333f3ad125bb7562d37f1a5bc6d858fb700451c3cd2d565e7081c93 WatchSource:0}: Error finding container be33d4775333f3ad125bb7562d37f1a5bc6d858fb700451c3cd2d565e7081c93: Status 404 returned error can't find the container with id be33d4775333f3ad125bb7562d37f1a5bc6d858fb700451c3cd2d565e7081c93 Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.191085 4822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="800ms" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.416284 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.417726 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.417759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.417782 4822 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.417835 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.418334 4822 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.447828 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.447950 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:14 crc kubenswrapper[4822]: W1010 06:24:14.552230 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.552325 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:14 crc 
kubenswrapper[4822]: I1010 06:24:14.584942 4822 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.654745 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f8aab2e1371d470cda553a5ab8b5e0a440e1d4171c2294d02a829bab131c477"} Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.655906 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6989f17ef2e522743098d76d396de48c0c64ee7b93e70317fe22b11789f8d6f0"} Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.656871 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0a750b73242a9a8093898518523bc5eba5f2d3147bcfb40b86fa74d599d99330"} Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.658424 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"be33d4775333f3ad125bb7562d37f1a5bc6d858fb700451c3cd2d565e7081c93"} Oct 10 06:24:14 crc kubenswrapper[4822]: I1010 06:24:14.659451 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"462dfcf6830b3d607e2bef9d6b8ae38ee72e65a5946109902792e1787e1f31ac"} Oct 10 06:24:14 crc kubenswrapper[4822]: E1010 06:24:14.992633 4822 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="1.6s" Oct 10 06:24:15 crc kubenswrapper[4822]: W1010 06:24:15.115267 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:15 crc kubenswrapper[4822]: E1010 06:24:15.115449 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:15 crc kubenswrapper[4822]: W1010 06:24:15.160325 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:15 crc kubenswrapper[4822]: E1010 06:24:15.160568 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.218863 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.221073 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.221112 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.221122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.221205 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:15 crc kubenswrapper[4822]: E1010 06:24:15.222036 4822 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.584451 4822 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.664303 4822 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540" exitCode=0 Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.664391 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.664519 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.666301 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 
10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.666355 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.666370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.667576 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.667599 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.667610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.669747 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee" exitCode=0 Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.669872 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.670141 4822 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.671539 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.671576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.671591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.673951 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.674970 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.674996 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.675008 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.675965 4822 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="182e0f89b8f3646cc06497e0c6b03697a690f0b702eb6d4c2b9d793fa4b4788a" exitCode=0 Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.676045 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"182e0f89b8f3646cc06497e0c6b03697a690f0b702eb6d4c2b9d793fa4b4788a"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.676171 4822 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.677940 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.677968 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.677980 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.684764 4822 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4" exitCode=0 Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.684846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4"} Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.685094 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.686465 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.686512 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:15 crc kubenswrapper[4822]: I1010 06:24:15.686524 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.584371 4822 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:16 crc kubenswrapper[4822]: E1010 06:24:16.593980 4822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="3.2s" Oct 10 06:24:16 crc kubenswrapper[4822]: W1010 06:24:16.609769 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:16 crc kubenswrapper[4822]: E1010 06:24:16.610424 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.690084 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"137f964799e4e133b15ded88d77b866d8f21a6ca1629315307693070e9248b7a"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.690224 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.691139 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.691173 4822 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.691186 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.697542 4822 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca" exitCode=0 Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.697590 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.697611 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.702363 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.702409 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.702418 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.710384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.710437 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.710460 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.710514 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.711596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.711641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.711654 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.713075 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.713114 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.714363 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.714397 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.714410 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.716208 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.716281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.716296 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.716310 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7"} Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.822579 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.827232 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.827274 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 
06:24:16.827286 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:16 crc kubenswrapper[4822]: I1010 06:24:16.827312 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:16 crc kubenswrapper[4822]: E1010 06:24:16.827829 4822 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 10 06:24:17 crc kubenswrapper[4822]: W1010 06:24:17.121931 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:17 crc kubenswrapper[4822]: E1010 06:24:17.122171 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:24:17 crc kubenswrapper[4822]: W1010 06:24:17.207603 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 10 06:24:17 crc kubenswrapper[4822]: E1010 06:24:17.207704 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" 
logger="UnhandledError" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.721193 4822 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a" exitCode=0 Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.721295 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a"} Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.721464 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.723227 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.723271 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.723291 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.727024 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.728896 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06" exitCode=255 Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.728995 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.729036 4822 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.729652 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.729974 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730013 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730123 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730168 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730286 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730341 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730734 4822 scope.go:117] "RemoveContainer" containerID="91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.730914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06"} Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731343 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731410 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731423 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731494 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:17 crc kubenswrapper[4822]: I1010 06:24:17.731505 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.524440 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.737107 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d"} Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.737164 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7"} Oct 10 06:24:18 crc 
kubenswrapper[4822]: I1010 06:24:18.737184 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927"} Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.737200 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145"} Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.739558 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.741440 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.742239 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.742723 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725"} Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.742776 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.743229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.743261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 
06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.743277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.744112 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.744207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:18 crc kubenswrapper[4822]: I1010 06:24:18.744221 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.587416 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.587704 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.589195 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.589229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.589240 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.636970 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.748948 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5"} Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.749083 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.749097 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.749160 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.750463 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.750501 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.750513 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.750967 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.751000 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:19 crc kubenswrapper[4822]: I1010 06:24:19.751013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.028220 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.029912 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.029963 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.029972 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.030008 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.311366 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.311613 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.313399 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.313444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.313460 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.751665 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.751665 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.753566 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.753630 4822 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.753645 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.754082 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.754117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:20 crc kubenswrapper[4822]: I1010 06:24:20.754138 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:21 crc kubenswrapper[4822]: I1010 06:24:21.447502 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:21 crc kubenswrapper[4822]: I1010 06:24:21.447709 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:21 crc kubenswrapper[4822]: I1010 06:24:21.449134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:21 crc kubenswrapper[4822]: I1010 06:24:21.449175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:21 crc kubenswrapper[4822]: I1010 06:24:21.449186 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:22 crc kubenswrapper[4822]: I1010 06:24:22.263592 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:22 crc kubenswrapper[4822]: I1010 06:24:22.264026 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Oct 10 06:24:22 crc kubenswrapper[4822]: I1010 06:24:22.265887 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:22 crc kubenswrapper[4822]: I1010 06:24:22.265955 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:22 crc kubenswrapper[4822]: I1010 06:24:22.265979 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.313341 4822 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.313455 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 06:24:23 crc kubenswrapper[4822]: E1010 06:24:23.719419 4822 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.826448 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.826636 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.827979 4822 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.828029 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.828049 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.850613 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.958741 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.958976 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.960589 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.960662 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:23 crc kubenswrapper[4822]: I1010 06:24:23.960679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.428187 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.763387 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.763387 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:24 crc kubenswrapper[4822]: 
I1010 06:24:24.764737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.764942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.765079 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.766225 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.766394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.766452 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:24 crc kubenswrapper[4822]: I1010 06:24:24.767481 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:25 crc kubenswrapper[4822]: I1010 06:24:25.766172 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:25 crc kubenswrapper[4822]: I1010 06:24:25.767339 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:25 crc kubenswrapper[4822]: I1010 06:24:25.767382 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:25 crc kubenswrapper[4822]: I1010 06:24:25.767393 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:27 crc kubenswrapper[4822]: I1010 06:24:27.586002 4822 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 10 06:24:27 crc kubenswrapper[4822]: W1010 06:24:27.818471 4822 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 10 06:24:27 crc kubenswrapper[4822]: I1010 06:24:27.818589 4822 trace.go:236] Trace[659987522]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:24:17.817) (total time: 10001ms): Oct 10 06:24:27 crc kubenswrapper[4822]: Trace[659987522]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:24:27.818) Oct 10 06:24:27 crc kubenswrapper[4822]: Trace[659987522]: [10.001402686s] [10.001402686s] END Oct 10 06:24:27 crc kubenswrapper[4822]: E1010 06:24:27.818622 4822 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 10 06:24:27 crc kubenswrapper[4822]: E1010 06:24:27.862133 4822 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186d0d1e66031952 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-10 06:24:13.583391058 
+0000 UTC m=+0.678549264,LastTimestamp:2025-10-10 06:24:13.583391058 +0000 UTC m=+0.678549264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 10 06:24:28 crc kubenswrapper[4822]: I1010 06:24:28.561626 4822 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 10 06:24:28 crc kubenswrapper[4822]: I1010 06:24:28.561694 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 10 06:24:28 crc kubenswrapper[4822]: I1010 06:24:28.565642 4822 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 10 06:24:28 crc kubenswrapper[4822]: I1010 06:24:28.565723 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.116934 4822 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 10 06:24:32 crc 
kubenswrapper[4822]: I1010 06:24:32.271694 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.271890 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.273152 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.273194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.273208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.276938 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.784413 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.788516 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.788584 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:32 crc kubenswrapper[4822]: I1010 06:24:32.788600 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.311816 4822 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.312583 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.557575 4822 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.558791 4822 trace.go:236] Trace[1846401869]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:24:20.625) (total time: 12933ms):
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[1846401869]: ---"Objects listed" error: 12933ms (06:24:33.558)
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[1846401869]: [12.933103045s] [12.933103045s] END
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.558857 4822 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.560532 4822 trace.go:236] Trace[992817394]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:24:23.297) (total time: 10263ms):
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[992817394]: ---"Objects listed" error: 10263ms (06:24:33.560)
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[992817394]: [10.263173205s] [10.263173205s] END
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.560555 4822 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.560601 4822 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.562443 4822 trace.go:236] Trace[446270646]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:24:21.570) (total time: 11992ms):
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[446270646]: ---"Objects listed" error: 11992ms (06:24:33.562)
Oct 10 06:24:33 crc kubenswrapper[4822]: Trace[446270646]: [11.992291313s] [11.992291313s] END
Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.562455 4822 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.562468 4822 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.574615 4822 apiserver.go:52] "Watching apiserver"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.578361 4822 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.578834 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.579314 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.579392 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.579485 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.579671 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.579731 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.579548 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.580084 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.580262 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.580981 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.582472 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.582956 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.583006 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.583500 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.583672 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.583736 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.584004 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.584048 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.584301 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.588260 4822 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.605772 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.627040 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.643820 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.661783 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.661851 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.661879 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.662731 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.662761 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.662845 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663071 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663259 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.661901 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663562 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663591 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663617 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663638 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663663 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663684 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664057 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.663708 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664136 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664161 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664181 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664205 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664209 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664228 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664439 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664509 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664847 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664884 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664906 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664929 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664951 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664975 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.664998 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665055 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665082 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665108 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665130 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665153 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665176 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665199 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665224 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665245 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665265 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665312 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665335 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665357 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665381 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665404 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665423 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665445 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665475 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665500 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665520 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName:
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665540 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665560 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665581 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665884 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665906 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:24:33 crc 
kubenswrapper[4822]: I1010 06:24:33.665928 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665949 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665968 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665991 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666106 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666137 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666161 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666181 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666230 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666248 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666267 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 10 06:24:33 crc 
kubenswrapper[4822]: I1010 06:24:33.666288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666311 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666333 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666354 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666374 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666398 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666420 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666442 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666464 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667933 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667988 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668009 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668034 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668055 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668073 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668111 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668128 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668145 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668162 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668179 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668200 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668219 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668236 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668254 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668270 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668305 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668321 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668335 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668350 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668365 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668380 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668395 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668411 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668432 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668451 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668485 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668503 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668519 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668534 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668559 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668578 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668603 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " 
Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668632 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668646 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668662 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668679 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668694 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668709 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668726 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668740 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668775 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668791 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668822 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668838 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668854 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668872 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668897 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668917 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668932 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668947 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668962 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668979 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.668996 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 
06:24:33.669012 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669029 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669048 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669066 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669083 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669100 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669117 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669133 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669152 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669167 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669253 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669272 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669290 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669306 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669323 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669341 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669359 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669381 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669404 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669425 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669446 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669469 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669490 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669511 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669535 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669558 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669579 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669601 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669623 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669644 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669663 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669679 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669695 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:24:33 crc 
kubenswrapper[4822]: I1010 06:24:33.669711 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669728 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669744 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669759 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669775 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.669792 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670009 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670043 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670061 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670130 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670148 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670165 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670183 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670201 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670218 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670236 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670258 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670280 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670297 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670313 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670330 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670348 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670369 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670387 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670404 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670421 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670453 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670481 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670498 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670515 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670533 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670559 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 
06:24:33.670576 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670593 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670610 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670656 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670723 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670746 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670776 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670829 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670849 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670868 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670888 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670906 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670926 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670943 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670963 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670982 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671046 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671058 4822 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671070 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671079 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 
06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671090 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671100 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671110 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671119 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665609 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.665862 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666009 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666046 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666274 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666610 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.666913 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667195 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667637 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674541 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667708 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.667863 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.670263 4822 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58668->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671168 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671278 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671585 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671616 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671669 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671876 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.671955 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672000 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672054 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672154 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672373 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672377 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672344 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672752 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.672961 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673027 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673202 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673297 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673424 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673766 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.673930 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674104 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674261 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674351 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674375 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674634 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674830 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.674883 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675181 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675212 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675497 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675593 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.675623 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:24:34.174735556 +0000 UTC m=+21.269893832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675729 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.675935 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.676460 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.676548 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.676933 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.677023 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.676893 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58668->192.168.126.11:17697: read: connection reset by peer" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.677837 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.677922 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678021 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678110 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678250 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678401 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678764 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.678967 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679074 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679106 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679158 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679265 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679308 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679436 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.679989 4822 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.680188 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.680232 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.680486 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.680597 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.681033 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.681075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.682941 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.683163 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.684193 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.684260 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.684496 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.684625 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685147 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685354 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685373 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685385 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685602 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685636 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.685879 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.686403 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.686247 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.686460 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.686488 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687015 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687258 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687321 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687640 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687877 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.687907 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688111 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688130 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688175 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688509 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688569 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688750 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688846 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.688923 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689160 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689239 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689267 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689395 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689510 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.689857 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.690061 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.690096 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.690342 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.690771 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.690831 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691019 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691287 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691658 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691761 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.691772 4822 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.692024 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.692406 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.692342 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.693669 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.694332 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.694549 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.694616 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:34.194590894 +0000 UTC m=+21.289749300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.695075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.695106 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.695126 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.695128 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:34.195108569 +0000 UTC m=+21.290266765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.695139 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.695130 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: 
"6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.694788 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.695258 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.695447 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:34.195435278 +0000 UTC m=+21.290593474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.695984 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.696180 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.696632 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698415 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.697550 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698431 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.697658 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.697845 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698038 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698071 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698243 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698360 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.698708 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.701847 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.701876 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.702859 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.702929 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.703562 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.704033 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.704307 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.705312 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.705833 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.706046 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.706840 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.706952 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.707306 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.707544 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.707606 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.708077 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod 
"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.708124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.708203 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.709641 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.709677 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.709694 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.709755 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:34.209733437 +0000 UTC m=+21.304891653 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.709795 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.710619 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.714228 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.714585 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.716453 4822 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.716507 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.716846 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.718010 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.718050 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.718519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.719735 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.722062 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.722777 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.723654 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.723655 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.724343 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.726051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.726258 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.726484 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.726502 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.726874 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.728092 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.731279 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.731656 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.731915 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.732503 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.732687 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.732742 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.733318 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.733355 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.733409 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.733550 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.733792 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.734058 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.734115 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.734983 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.735692 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.735748 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.735826 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.736633 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.747197 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.748143 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.763081 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.771604 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.771638 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.772083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.771725 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.772241 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773106 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773134 4822 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773196 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773212 4822 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773222 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773240 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773249 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773268 4822 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773277 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773299 4822 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773311 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773331 4822 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773351 4822 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773372 4822 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773393 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773407 4822 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath 
\"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773418 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773429 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773461 4822 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773472 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773510 4822 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773536 4822 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773546 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773556 4822 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773571 4822 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773582 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773607 4822 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773634 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773662 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773674 4822 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773685 4822 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773707 4822 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773730 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773757 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773770 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773820 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.773983 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774005 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774024 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774045 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774063 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774073 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774082 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774092 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774112 4822 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc 
kubenswrapper[4822]: I1010 06:24:33.774130 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774139 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774157 4822 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774166 4822 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774185 4822 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774203 4822 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774211 4822 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774282 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774292 4822 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774328 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774337 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774346 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774356 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774365 4822 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774392 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774412 4822 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774181 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774423 4822 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774631 4822 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774645 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774658 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774670 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774682 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774694 4822 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774709 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774720 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774731 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774743 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node 
\"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774755 4822 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774766 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774777 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774790 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774831 4822 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774844 4822 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774856 4822 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774869 4822 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774880 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774891 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774903 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774914 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774926 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774938 4822 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774951 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774963 4822 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774976 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.774988 4822 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775002 4822 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775014 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775026 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775039 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" 
DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775053 4822 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775069 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775081 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775092 4822 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775105 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775124 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775137 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775149 4822 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775161 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775173 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775185 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775197 4822 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775208 4822 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775221 4822 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775232 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775243 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775258 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775270 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775283 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775295 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775308 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775319 4822 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775332 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775345 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775358 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775369 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775382 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775396 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775408 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775421 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775432 4822 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775443 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775455 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775467 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775479 4822 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775491 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 
crc kubenswrapper[4822]: I1010 06:24:33.775502 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775514 4822 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775525 4822 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775536 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775547 4822 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775559 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775570 4822 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775581 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775594 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775606 4822 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775618 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775630 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775643 4822 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775655 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775667 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775679 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775692 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775703 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775715 4822 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775728 4822 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775740 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775753 4822 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775766 4822 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775780 4822 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775793 4822 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775826 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775837 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775851 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775864 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775878 4822 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775891 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775904 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775917 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775930 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775941 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775952 4822 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: 
I1010 06:24:33.775965 4822 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775976 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.775994 4822 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776006 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776018 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776042 4822 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776054 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776065 4822 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776079 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776092 4822 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776105 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776118 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776132 4822 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776145 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776161 4822 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776174 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776186 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776198 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.776213 4822 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.782119 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.787261 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.791073 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.791706 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.794157 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725" exitCode=255 Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.794211 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725"} Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.794262 4822 scope.go:117] "RemoveContainer" containerID="91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.805032 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.815424 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.825178 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.836501 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.847444 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.859134 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.863108 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.863494 4822 scope.go:117] "RemoveContainer" containerID="b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725" Oct 10 06:24:33 crc kubenswrapper[4822]: E1010 06:24:33.863879 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.870451 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.877518 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.882293 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.898604 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.907502 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:24:33 crc kubenswrapper[4822]: W1010 06:24:33.911952 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a2810a480265ca28f0389f070191358a9f25d1d4a28b33ccee1603bbf0e3981c WatchSource:0}: Error finding container a2810a480265ca28f0389f070191358a9f25d1d4a28b33ccee1603bbf0e3981c: Status 404 returned error can't find the container with id a2810a480265ca28f0389f070191358a9f25d1d4a28b33ccee1603bbf0e3981c Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.912763 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:24:33 crc kubenswrapper[4822]: W1010 06:24:33.925928 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-543056bc9d16cb2366326dd81b33982398052a2f0339c85bfc47e9137936aa71 WatchSource:0}: Error finding container 543056bc9d16cb2366326dd81b33982398052a2f0339c85bfc47e9137936aa71: Status 404 returned error can't find the container with id 543056bc9d16cb2366326dd81b33982398052a2f0339c85bfc47e9137936aa71 Oct 10 06:24:33 crc kubenswrapper[4822]: I1010 06:24:33.994862 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.022889 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.032985 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.036468 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.062943 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.095077 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.119130 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"message\\\":\\\"W1010 06:24:16.888506 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1010 06:24:16.888956 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760077456 cert, and key in /tmp/serving-cert-3050225769/serving-signer.crt, 
/tmp/serving-cert-3050225769/serving-signer.key\\\\nI1010 06:24:17.127952 1 observer_polling.go:159] Starting file observer\\\\nW1010 06:24:17.131223 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1010 06:24:17.131373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:17.132751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3050225769/tls.crt::/tmp/serving-cert-3050225769/tls.key\\\\\\\"\\\\nF1010 06:24:17.370441 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.138517 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.154166 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.172413 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.183120 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.183282 4822 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:24:35.183265292 +0000 UTC m=+22.278423488 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.196489 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"message\\\":\\\"W1010 06:24:16.888506 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1010 06:24:16.888956 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760077456 cert, and key in /tmp/serving-cert-3050225769/serving-signer.crt, /tmp/serving-cert-3050225769/serving-signer.key\\\\nI1010 06:24:17.127952 1 observer_polling.go:159] Starting file observer\\\\nW1010 06:24:17.131223 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1010 06:24:17.131373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:17.132751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3050225769/tls.crt::/tmp/serving-cert-3050225769/tls.key\\\\\\\"\\\\nF1010 06:24:17.370441 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.234018 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e4
4d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.247613 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.261677 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.276650 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.284225 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 
06:24:34.284276 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.284303 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.284322 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284435 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284496 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:35.284479847 +0000 UTC m=+22.379638043 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284514 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284549 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284598 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284615 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284619 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:35.28459341 +0000 UTC m=+22.379751606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284698 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:35.284667462 +0000 UTC m=+22.379825659 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284753 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284812 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.284831 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:34 crc 
kubenswrapper[4822]: E1010 06:24:34.284912 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:35.284888989 +0000 UTC m=+22.380047365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.308757 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.352254 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.365273 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.692999 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-889pc"] Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.693520 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.697502 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.697709 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.713774 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.731841 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.745384 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.761347 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.785137 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.788516 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2z7\" (UniqueName: \"kubernetes.io/projected/2f12f36f-6d53-433a-9604-e70600ccdbe6-kube-api-access-xd2z7\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.788591 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f12f36f-6d53-433a-9604-e70600ccdbe6-hosts-file\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.797016 4822 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.798087 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.798141 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a2810a480265ca28f0389f070191358a9f25d1d4a28b33ccee1603bbf0e3981c"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.799911 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.801923 4822 scope.go:117] "RemoveContainer" containerID="b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725" Oct 10 06:24:34 crc kubenswrapper[4822]: E1010 06:24:34.802093 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.803024 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"543056bc9d16cb2366326dd81b33982398052a2f0339c85bfc47e9137936aa71"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.804860 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.804898 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.804912 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1299191a056018f23a516dac439514da111ef3bd42bc724d699ca15b2a996584"} Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.818415 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91deee592cba07360f93e2a968718215f74aedfe5f298858b9992a2dbbe59c06\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"message\\\":\\\"W1010 06:24:16.888506 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1010 06:24:16.888956 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760077456 cert, and key in /tmp/serving-cert-3050225769/serving-signer.crt, /tmp/serving-cert-3050225769/serving-signer.key\\\\nI1010 06:24:17.127952 1 observer_polling.go:159] Starting file observer\\\\nW1010 06:24:17.131223 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1010 06:24:17.131373 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:17.132751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3050225769/tls.crt::/tmp/serving-cert-3050225769/tls.key\\\\\\\"\\\\nF1010 06:24:17.370441 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.831417 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.851952 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.870422 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.890154 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.890967 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2z7\" (UniqueName: \"kubernetes.io/projected/2f12f36f-6d53-433a-9604-e70600ccdbe6-kube-api-access-xd2z7\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.891337 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f12f36f-6d53-433a-9604-e70600ccdbe6-hosts-file\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.891417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f12f36f-6d53-433a-9604-e70600ccdbe6-hosts-file\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.910757 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.916770 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2z7\" (UniqueName: \"kubernetes.io/projected/2f12f36f-6d53-433a-9604-e70600ccdbe6-kube-api-access-xd2z7\") pod \"node-resolver-889pc\" (UID: \"2f12f36f-6d53-433a-9604-e70600ccdbe6\") " pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.926588 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.940634 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.954663 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.975642 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:34 crc kubenswrapper[4822]: I1010 06:24:34.992064 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.012332 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.021391 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-889pc" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.026015 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.092380 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w2fl5"] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.093166 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nrdcs"] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.093595 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5x2kt"] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.093888 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.094264 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.094664 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.098738 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.099267 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.099421 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.099444 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.099633 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.099947 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.100526 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.100666 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.100994 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 10 06:24:35 crc kubenswrapper[4822]: 
I1010 06:24:35.101155 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.101194 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.104096 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.166536 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194126 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194226 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-os-release\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194265 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-cnibin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.194294 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:24:37.194275841 +0000 UTC m=+24.289434037 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194318 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cnibin\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194359 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194380 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hhj\" (UniqueName: \"kubernetes.io/projected/86167202-f72a-4271-bdbe-32ba0bf71fff-kube-api-access-t4hhj\") pod 
\"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194397 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-k8s-cni-cncf-io\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194413 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-bin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194482 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-kubelet\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194534 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-hostroot\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194587 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-etc-kubernetes\") pod 
\"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194656 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86167202-f72a-4271-bdbe-32ba0bf71fff-rootfs\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194691 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-system-cni-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194715 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194748 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86167202-f72a-4271-bdbe-32ba0bf71fff-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194779 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-conf-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194829 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-multus-certs\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194859 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-netns\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194880 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9t9\" (UniqueName: \"kubernetes.io/projected/ec9c77cf-dd02-4e39-b204-9f6540406973-kube-api-access-6z9t9\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194921 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8kf\" (UniqueName: \"kubernetes.io/projected/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-kube-api-access-cq8kf\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194963 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-system-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.194982 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-cni-binary-copy\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195004 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-multus\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195035 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-os-release\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195069 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195092 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-socket-dir-parent\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195114 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-daemon-config\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.195192 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86167202-f72a-4271-bdbe-32ba0bf71fff-proxy-tls\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.210122 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.254413 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.281033 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295853 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295889 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cnibin\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295906 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-bin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295943 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-kubelet\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295965 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-hostroot\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.295991 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t4hhj\" (UniqueName: \"kubernetes.io/projected/86167202-f72a-4271-bdbe-32ba0bf71fff-kube-api-access-t4hhj\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296013 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-k8s-cni-cncf-io\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296035 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-etc-kubernetes\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296082 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86167202-f72a-4271-bdbe-32ba0bf71fff-rootfs\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296104 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86167202-f72a-4271-bdbe-32ba0bf71fff-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296126 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-system-cni-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296146 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296167 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-conf-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296187 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-multus-certs\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296207 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-netns\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296318 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cnibin\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296372 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-k8s-cni-cncf-io\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296405 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296421 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-system-cni-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-kubelet\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296453 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-conf-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296516 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-hostroot\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296541 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-etc-kubernetes\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.296609 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.296662 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:37.296643289 +0000 UTC m=+24.391801485 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296694 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86167202-f72a-4271-bdbe-32ba0bf71fff-rootfs\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296953 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-multus-certs\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.296989 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-run-netns\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297254 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-bin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297460 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297301 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9t9\" (UniqueName: \"kubernetes.io/projected/ec9c77cf-dd02-4e39-b204-9f6540406973-kube-api-access-6z9t9\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297539 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297544 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-multus\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297575 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-host-var-lib-cni-multus\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297611 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/86167202-f72a-4271-bdbe-32ba0bf71fff-mcd-auth-proxy-config\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297624 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297658 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8kf\" (UniqueName: \"kubernetes.io/projected/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-kube-api-access-cq8kf\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-system-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297727 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-cni-binary-copy\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297749 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-daemon-config\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-system-cni-dir\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.298109 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.298140 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.298159 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.298223 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:37.298206894 +0000 UTC m=+24.393365290 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298359 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-os-release\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298392 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-cni-binary-copy\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298513 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-daemon-config\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.297770 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-os-release\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298609 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298636 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-socket-dir-parent\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299192 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299244 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86167202-f72a-4271-bdbe-32ba0bf71fff-proxy-tls\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299267 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-os-release\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: 
I1010 06:24:35.299291 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299322 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-cnibin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299391 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-cnibin\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.298758 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-multus-socket-dir-parent\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.299473 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.299517 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-10 06:24:37.299505101 +0000 UTC m=+24.394663297 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.299145 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.300000 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec9c77cf-dd02-4e39-b204-9f6540406973-os-release\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.300081 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.300100 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.300114 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.300150 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:37.300141979 +0000 UTC m=+24.395300175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.304302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86167202-f72a-4271-bdbe-32ba0bf71fff-proxy-tls\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.308295 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.326635 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8kf\" (UniqueName: \"kubernetes.io/projected/1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3-kube-api-access-cq8kf\") pod \"multus-additional-cni-plugins-nrdcs\" (UID: \"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\") " pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.326657 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hhj\" (UniqueName: \"kubernetes.io/projected/86167202-f72a-4271-bdbe-32ba0bf71fff-kube-api-access-t4hhj\") pod \"machine-config-daemon-w2fl5\" (UID: \"86167202-f72a-4271-bdbe-32ba0bf71fff\") " pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.330385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9t9\" (UniqueName: \"kubernetes.io/projected/ec9c77cf-dd02-4e39-b204-9f6540406973-kube-api-access-6z9t9\") pod \"multus-5x2kt\" (UID: \"ec9c77cf-dd02-4e39-b204-9f6540406973\") " pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.339099 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.358861 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.380624 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.398997 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.408701 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5x2kt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.411698 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.415832 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:24:35 crc kubenswrapper[4822]: W1010 06:24:35.419825 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9c77cf_dd02_4e39_b204_9f6540406973.slice/crio-09be46f0eb9f435441c91871446993d415ce6fdac70cbcd1625a0f67fef6f30a WatchSource:0}: Error finding container 09be46f0eb9f435441c91871446993d415ce6fdac70cbcd1625a0f67fef6f30a: Status 404 returned error can't find the container with id 09be46f0eb9f435441c91871446993d415ce6fdac70cbcd1625a0f67fef6f30a Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.420923 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.426582 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: W1010 06:24:35.432365 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86167202_f72a_4271_bdbe_32ba0bf71fff.slice/crio-c48995675f13ed365b3d2fff5d2b37598480583d8471ba283c224457ef46a770 WatchSource:0}: Error finding container c48995675f13ed365b3d2fff5d2b37598480583d8471ba283c224457ef46a770: Status 404 returned error can't find the container with id c48995675f13ed365b3d2fff5d2b37598480583d8471ba283c224457ef46a770 Oct 10 
06:24:35 crc kubenswrapper[4822]: W1010 06:24:35.441590 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4a6cb7_61d8_4c6b_bb8d_2eed42f0b0b3.slice/crio-c0cf0a146456fcda744462b0d7ae91562766192b67e3205b24386b267b804a09 WatchSource:0}: Error finding container c0cf0a146456fcda744462b0d7ae91562766192b67e3205b24386b267b804a09: Status 404 returned error can't find the container with id c0cf0a146456fcda744462b0d7ae91562766192b67e3205b24386b267b804a09 Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.444007 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.460746 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.479868 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.500338 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.514145 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.516432 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bzbn"] Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.517664 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.519779 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521101 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521525 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521591 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521752 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521544 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.521934 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.539448 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.556972 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.574911 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.598261 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.614966 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.631382 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.649869 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.649929 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.650006 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.650081 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.650104 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:35 crc kubenswrapper[4822]: E1010 06:24:35.650196 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.652235 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.653913 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.654560 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.655406 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.656453 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.657141 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.658063 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.658737 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.659705 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.660427 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.661526 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.662156 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.663474 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.664011 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.664924 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.666073 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.666644 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.668400 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.669072 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.669680 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.670348 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.670826 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.671478 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.673177 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.673902 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.675015 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.675822 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.677124 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.677719 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.678448 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.680895 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.682530 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.683145 4822 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.683277 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.685957 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.686783 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.687309 4822 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.694913 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.695959 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.696646 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.701840 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704175 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704231 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704256 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704276 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704292 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704308 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704326 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704359 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704376 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704416 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704432 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704447 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.704461 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngm6\" (UniqueName: \"kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.709372 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.709456 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.709479 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.709503 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.709531 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.713292 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.714266 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.720997 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.722045 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.725225 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.725995 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.728140 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.728839 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.733084 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.733832 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.739128 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.739732 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.740412 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.742487 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.743181 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.746151 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.746578 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.770067 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.809773 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-889pc" event={"ID":"2f12f36f-6d53-433a-9604-e70600ccdbe6","Type":"ContainerStarted","Data":"429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.809834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-889pc" event={"ID":"2f12f36f-6d53-433a-9604-e70600ccdbe6","Type":"ContainerStarted","Data":"b1c75d2a7cdef1aec541fbd4611572e2d55031ce4ffcb93631fdeb05118ed356"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.812264 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.812323 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.812337 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"c48995675f13ed365b3d2fff5d2b37598480583d8471ba283c224457ef46a770"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813369 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813421 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813452 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813475 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813495 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813513 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813506 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813570 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 
06:24:35.813592 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813631 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813639 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813617 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813617 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813848 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813936 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.813987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngm6\" (UniqueName: \"kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814457 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814500 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814528 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash\") pod \"ovnkube-node-6bzbn\" (UID: 
\"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814557 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814586 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814628 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814661 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814689 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814718 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814735 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814764 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814816 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814874 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: 
\"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814890 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814905 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814935 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814944 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.814987 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc 
kubenswrapper[4822]: I1010 06:24:35.815005 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.815321 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.815586 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.815956 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.816514 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerStarted","Data":"9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.816559 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerStarted","Data":"09be46f0eb9f435441c91871446993d415ce6fdac70cbcd1625a0f67fef6f30a"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.817838 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.818892 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerStarted","Data":"8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.818922 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" 
event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerStarted","Data":"c0cf0a146456fcda744462b0d7ae91562766192b67e3205b24386b267b804a09"} Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.825852 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.837417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngm6\" (UniqueName: \"kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6\") pod \"ovnkube-node-6bzbn\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.869484 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.907632 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151
c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.949922 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:35 crc kubenswrapper[4822]: I1010 06:24:35.989205 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.026443 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.072977 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.107736 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.134960 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:36 crc kubenswrapper[4822]: W1010 06:24:36.149568 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd611ad_9a8c_489f_903b_d75912bb1fef.slice/crio-22e0e2b44f19b33be05e55d5b2b4b4d8ce58be4385f4852b106bbb4496f19628 WatchSource:0}: Error finding container 22e0e2b44f19b33be05e55d5b2b4b4d8ce58be4385f4852b106bbb4496f19628: Status 404 returned error can't find the container with id 22e0e2b44f19b33be05e55d5b2b4b4d8ce58be4385f4852b106bbb4496f19628 Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.157233 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.199521 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.229078 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.273882 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.346147 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.376761 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.408459 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.433836 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.465955 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.514238 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.548523 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.595833 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.632006 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.668161 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.707161 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.824007 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67" exitCode=0 Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.824099 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.824131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"22e0e2b44f19b33be05e55d5b2b4b4d8ce58be4385f4852b106bbb4496f19628"} Oct 10 06:24:36 crc 
kubenswrapper[4822]: I1010 06:24:36.829068 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" containerID="8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135" exitCode=0 Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.829171 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135"} Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.842065 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.872776 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.897163 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.919527 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.938417 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.955451 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:36 crc kubenswrapper[4822]: I1010 06:24:36.987668 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.028254 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.068068 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.107648 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.153471 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.187230 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.226226 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.228513 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.228635 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:24:41.22861878 +0000 UTC m=+28.323776976 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.266518 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.310983 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.329226 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.329277 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.329301 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.329354 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329471 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329521 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329524 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329555 4822 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329564 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329540 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329628 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:41.329609559 +0000 UTC m=+28.424767745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329699 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:41.329670461 +0000 UTC m=+28.424828837 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329572 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329482 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329838 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:41.329795895 +0000 UTC m=+28.424954091 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.329860 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:41.329850136 +0000 UTC m=+28.425008532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.347034 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.386303 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.426960 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.470335 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.508929 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.552936 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.595410 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.635629 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.649319 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.649428 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.649477 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.649595 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.649715 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:37 crc kubenswrapper[4822]: E1010 06:24:37.649893 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.669888 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.708192 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.749644 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.833426 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kwt79"] Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.833864 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.836728 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.837191 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.837454 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.839068 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.840034 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerStarted","Data":"0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.844896 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.844934 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.844948 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.844958 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.844968 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.847941 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64"} Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.877282 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.911840 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:37 crc kubenswrapper[4822]: I1010 06:24:37.969531 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.007730 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:37Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.026164 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.035316 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46a09c3-aa28-4097-bd15-5fe82d308dad-host\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.035379 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44qp\" (UniqueName: \"kubernetes.io/projected/e46a09c3-aa28-4097-bd15-5fe82d308dad-kube-api-access-k44qp\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.035453 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e46a09c3-aa28-4097-bd15-5fe82d308dad-serviceca\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.066208 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.110989 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.138059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44qp\" (UniqueName: \"kubernetes.io/projected/e46a09c3-aa28-4097-bd15-5fe82d308dad-kube-api-access-k44qp\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.138458 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e46a09c3-aa28-4097-bd15-5fe82d308dad-serviceca\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " 
pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.138747 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46a09c3-aa28-4097-bd15-5fe82d308dad-host\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.138871 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46a09c3-aa28-4097-bd15-5fe82d308dad-host\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.140197 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e46a09c3-aa28-4097-bd15-5fe82d308dad-serviceca\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.148277 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.180710 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44qp\" (UniqueName: \"kubernetes.io/projected/e46a09c3-aa28-4097-bd15-5fe82d308dad-kube-api-access-k44qp\") pod \"node-ca-kwt79\" (UID: \"e46a09c3-aa28-4097-bd15-5fe82d308dad\") " 
pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.208501 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.246919 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.287008 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.327794 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.372749 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.405258 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.453219 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.472141 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kwt79" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.497258 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.529112 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.568344 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.606051 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.649626 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.688669 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.727435 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.768339 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.805866 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.848273 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.853405 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" 
containerID="0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902" exitCode=0 Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.853510 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902"} Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.858385 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.860310 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kwt79" event={"ID":"e46a09c3-aa28-4097-bd15-5fe82d308dad","Type":"ContainerStarted","Data":"72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008"} Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.860340 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kwt79" event={"ID":"e46a09c3-aa28-4097-bd15-5fe82d308dad","Type":"ContainerStarted","Data":"80d435d431fc4221d008478c3f6ae1b4f23b4f4cb59766b102ca3039836964a9"} Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.900122 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.931680 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:38 crc kubenswrapper[4822]: I1010 06:24:38.966282 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.009178 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.050726 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.094510 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.128632 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.165503 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.206762 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.247719 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abc
a1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10
T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.293295 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.343315 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.366406 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.411464 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.454215 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.487780 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.532482 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.649710 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.649763 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.649822 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:39 crc kubenswrapper[4822]: E1010 06:24:39.649906 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:39 crc kubenswrapper[4822]: E1010 06:24:39.650007 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:39 crc kubenswrapper[4822]: E1010 06:24:39.650122 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.864989 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" containerID="9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90" exitCode=0 Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.865043 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90"} Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.885752 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.899787 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.917444 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.933196 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.949952 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.963387 4822 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.966332 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.967635 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.967669 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.967682 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.967853 4822 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.975107 4822 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.975459 4822 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.977730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.977759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.977768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.977783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.977793 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:39Z","lastTransitionTime":"2025-10-10T06:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.981036 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: E1010 06:24:39.993770 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:39 crc kubenswrapper[4822]: I1010 06:24:39.994160 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab21
7ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.005390 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.005443 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.005461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.005482 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.005496 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.010166 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 
06:24:40 crc kubenswrapper[4822]: E1010 06:24:40.020098 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.025548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.025765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.025851 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.025981 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.026046 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.033287 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: E1010 06:24:40.042214 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.046627 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.046825 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.046903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.047012 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.047094 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.055912 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: E1010 06:24:40.063543 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.067982 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.068032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.068050 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.068074 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.068086 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.075927 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: E1010 06:24:40.087884 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: E1010 06:24:40.088417 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094062 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094434 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094556 4822 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094666 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.094765 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.129956 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b
19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.197737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.198046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.198110 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.198202 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.198260 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.301852 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.301897 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.301909 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.301929 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.301939 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.315792 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.319320 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.324691 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.336200 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\
":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.352041 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.365665 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.380230 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.393569 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.405614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.405655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.405664 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.405680 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.405691 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.407726 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.427128 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.470475 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.509142 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.509259 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.509320 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.509330 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.509345 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc 
kubenswrapper[4822]: I1010 06:24:40.509358 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.546021 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.585959 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.611398 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.611461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.611473 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc 
kubenswrapper[4822]: I1010 06:24:40.611493 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.611506 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.628489 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 
06:24:40.674020 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.707702 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.714724 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.714767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.714787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 
06:24:40.714845 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.714860 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.749353 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b
19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.789325 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.817084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.817136 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.817147 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.817169 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.817182 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.827235 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.867973 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.871017 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" containerID="a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339" exitCode=0 Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.871127 4822 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.878649 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.910704 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.920403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.920457 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.920469 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.920490 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.920502 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:40Z","lastTransitionTime":"2025-10-10T06:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.949263 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:40 crc kubenswrapper[4822]: I1010 06:24:40.987689 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.023218 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.023268 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.023279 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.023298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.023313 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.029024 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:
24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.069741 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.107215 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.126736 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.126794 4822 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.126825 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.126844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.126854 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.146476 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.192080 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.224447 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.229095 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.229134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.229146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 
06:24:41.229166 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.229178 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.267024 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.276476 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.276940 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.276920437 +0000 UTC m=+36.372078653 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.311735 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.331988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.332021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.332031 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.332045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.332055 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.351835 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.376999 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.377048 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.377078 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.377097 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377136 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377176 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377188 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.377173995 +0000 UTC m=+36.472332191 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377190 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377205 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.377196305 +0000 UTC m=+36.472354501 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377210 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377225 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377270 4822 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.377259627 +0000 UTC m=+36.472417823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377367 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377406 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377424 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.377509 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.377479664 +0000 UTC m=+36.472637860 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.390912 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.429391 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.434649 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.434684 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.434694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.434710 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.434726 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.467779 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.507839 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.537832 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.537880 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc 
kubenswrapper[4822]: I1010 06:24:41.537893 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.537910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.537924 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.547614 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.593708 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.627088 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.642037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.642115 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.642141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.642202 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.642227 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.649300 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.649366 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.649498 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.649538 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.649689 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:41 crc kubenswrapper[4822]: E1010 06:24:41.649845 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.670234 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.711928 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.745010 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.745347 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.745441 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.745536 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.745617 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.763131 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a9
4712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.792400 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.831692 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.848873 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.848908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.848919 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.848935 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.848948 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.865074 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.885768 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" containerID="1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567" exitCode=0 Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.885829 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.913309 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.952047 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.952111 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.952123 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.952146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.952174 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:41Z","lastTransitionTime":"2025-10-10T06:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.958227 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:41 crc kubenswrapper[4822]: I1010 06:24:41.988624 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.025725 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.054658 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.054687 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.054698 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.054712 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.054723 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.069591 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.107164 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.147459 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.158987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.159064 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.159132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.159152 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.159163 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.188411 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.228727 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.262207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.262265 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.262280 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.262298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.262313 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.272785 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.308029 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.350979 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.365135 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.365190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.365204 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.365229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.365248 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.387058 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.429448 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.467845 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.467882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.467891 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.467904 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.467915 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.473940 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.505330 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.570821 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.570866 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.570878 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 
06:24:42.570893 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.570904 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.680683 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.680727 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.680738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.680755 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.680766 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.790159 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.790215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.790228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.790244 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.790257 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.892264 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.892301 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.892310 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.892326 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.892337 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.896279 4822 generic.go:334] "Generic (PLEG): container finished" podID="1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3" containerID="9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22" exitCode=0 Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.896410 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerDied","Data":"9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.906721 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.907697 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.907761 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.907785 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.924221 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.939061 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.939527 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.946110 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.957526 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.971665 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.983952 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.996602 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.997531 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.997554 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.997572 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.997584 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:42Z","lastTransitionTime":"2025-10-10T06:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:42 crc kubenswrapper[4822]: I1010 06:24:42.998366 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.014694 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.030578 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.045980 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.057944 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.068639 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.082426 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.100514 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.101090 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.101158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.101182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.101214 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.103128 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.115903 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.136286 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.149536 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.188831 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.204403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.204463 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.204474 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.204491 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.204504 4822 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.228117 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.272695 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.308419 4822 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.308477 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.308492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.308512 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.308524 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.309226 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.350392 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.388110 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.416562 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.416629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.416640 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.416655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.416667 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.425229 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.467470 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.508031 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.519416 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.519461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.519471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.519490 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.519502 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.547297 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.587031 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.622034 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.622083 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.622095 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.622113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.622122 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.633951 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.650188 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.650201 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:43 crc kubenswrapper[4822]: E1010 06:24:43.650492 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:43 crc kubenswrapper[4822]: E1010 06:24:43.650631 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.650978 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:43 crc kubenswrapper[4822]: E1010 06:24:43.651078 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.667068 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.714958 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.716093 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.716956 4822 scope.go:117] "RemoveContainer" containerID="b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.724043 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.724079 4822 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.724088 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.724102 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.724113 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.746210 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.786571 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.827682 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.827733 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.827742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.827761 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.827774 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.828635 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.867531 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.908335 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.916628 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" event={"ID":"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3","Type":"ContainerStarted","Data":"aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.918939 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.920592 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.920819 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.929754 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.929789 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.929832 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.929849 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.929857 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:43Z","lastTransitionTime":"2025-10-10T06:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.948634 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:43 crc kubenswrapper[4822]: I1010 06:24:43.987362 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.028230 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.032173 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 
06:24:44.032244 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.032262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.032287 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.032304 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.068743 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.115652 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.135106 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.135176 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.135190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.135206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.135218 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.153323 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.192354 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fa
bf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.228656 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.237763 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.237833 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.237847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.237867 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.237879 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.278711 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.310293 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.340445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.340493 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.340508 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.340525 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.340536 4822 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.352649 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438
363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.390756 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.426240 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.442735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.442774 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.442783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.442813 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.442826 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.467588 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.506217 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547197 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547239 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547368 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547391 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.547406 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.586899 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.625707 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.649765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.649820 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.649832 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.649847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.649859 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.683817 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:
24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.712177 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.750084 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.751583 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.751611 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.751623 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.751639 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.751651 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.786388 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.830637 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.854264 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.854296 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.854309 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.854329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.854342 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.865812 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.909530 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-10T06:24:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.957185 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.957237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.957247 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.957277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:44 crc kubenswrapper[4822]: I1010 06:24:44.957286 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:44Z","lastTransitionTime":"2025-10-10T06:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.060349 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.060383 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.060393 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.060409 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.060418 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.163706 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.164183 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.164200 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.164220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.164230 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.266341 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.266394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.266402 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.266418 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.266428 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.368642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.368687 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.368696 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.368711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.368723 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.471330 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.471421 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.471450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.471481 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.471505 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.573908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.573953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.573965 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.573980 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.573990 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.649864 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.649918 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.649964 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:45 crc kubenswrapper[4822]: E1010 06:24:45.650029 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:45 crc kubenswrapper[4822]: E1010 06:24:45.650133 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:45 crc kubenswrapper[4822]: E1010 06:24:45.650290 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.676002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.676042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.676052 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.676067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.676077 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.779229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.779263 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.779270 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.779283 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.779292 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.881997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.882032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.882042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.882074 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.882088 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.929292 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/0.log" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.932170 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af" exitCode=1 Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.932223 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af"} Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.932970 4822 scope.go:117] "RemoveContainer" containerID="392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.948335 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.969593 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:45Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}\\\\nI1010 06:24:45.210431 6107 
services_controller.go:445] Built service openshift-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1010 06:24:45.210400 6107 services_controller.go:452] Built service openshift-kube-storage-version-migrator-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1010 06:24:45.210449 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1010 06:24:45.210426 6107 services_controller.go:451] Built service openshift-kube-apiserver/apiserver cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1010 06:24:45.210463 6107 
services_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75
c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.981846 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.986567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.986627 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.986644 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.986668 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:45 crc kubenswrapper[4822]: I1010 06:24:45.986686 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:45Z","lastTransitionTime":"2025-10-10T06:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.008870 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.025444 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.043176 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.058913 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.076673 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.091657 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.091906 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.091973 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.091992 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.092018 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.092038 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.107673 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.122035 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.133922 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.147653 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.162002 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.175485 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.194897 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.194951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.194962 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.194984 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.194999 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.298076 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.298119 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.298129 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.298149 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.298162 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.400478 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.400534 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.400544 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.400563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.400574 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.502764 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.502838 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.502850 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.502866 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.502877 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.605440 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.605472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.605481 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.605493 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.605501 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.707842 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.707899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.707912 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.707935 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.707949 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.811206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.811285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.811300 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.811316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.811327 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.914234 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.914292 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.914301 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.914314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.914323 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:46Z","lastTransitionTime":"2025-10-10T06:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.938346 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/0.log" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.941123 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815"} Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.941494 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.965436 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://69
0fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.979579 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:46 crc kubenswrapper[4822]: I1010 06:24:46.994700 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.015317 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.016846 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.016896 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.016908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.016926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.016938 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.034620 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.051526 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.068067 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.086496 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.098857 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.110141 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.119362 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.119393 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.119401 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.119413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.119424 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.123516 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z 
is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.136320 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.149676 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.169495 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:45Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}\\\\nI1010 06:24:45.210431 6107 services_controller.go:445] Built service openshift-controller-manager-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI1010 06:24:45.210400 6107 services_controller.go:452] Built service openshift-kube-storage-version-migrator-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1010 06:24:45.210449 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1010 06:24:45.210426 6107 services_controller.go:451] Built service openshift-kube-apiserver/apiserver cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1010 06:24:45.210463 6107 
services_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.182065 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.222022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.222059 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.222091 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.222106 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.222118 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.325578 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.325656 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.325674 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.325694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.325731 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.428212 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.428257 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.428267 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.428281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.428290 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.481968 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm"] Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.482591 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.484784 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.485029 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.495701 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f
1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.508756 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.528132 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:45Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}\\\\nI1010 06:24:45.210431 6107 services_controller.go:445] Built service openshift-controller-manager-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI1010 06:24:45.210400 6107 services_controller.go:452] Built service openshift-kube-storage-version-migrator-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1010 06:24:45.210449 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1010 06:24:45.210426 6107 services_controller.go:451] Built service openshift-kube-apiserver/apiserver cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1010 06:24:45.210463 6107 
services_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.530506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.530615 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.530641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.530671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.530694 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.539965 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6be06d9-ad0f-4110-bba3-962524886f08-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.540036 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.540069 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnksp\" (UniqueName: \"kubernetes.io/projected/c6be06d9-ad0f-4110-bba3-962524886f08-kube-api-access-dnksp\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.540102 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.548884 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b
4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.564135 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.582950 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.595075 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.609238 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.622693 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.633220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.633338 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.633353 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.633372 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.633409 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.634690 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.641082 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6be06d9-ad0f-4110-bba3-962524886f08-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.641131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.641189 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnksp\" (UniqueName: 
\"kubernetes.io/projected/c6be06d9-ad0f-4110-bba3-962524886f08-kube-api-access-dnksp\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.641287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.642080 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.643031 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6be06d9-ad0f-4110-bba3-962524886f08-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.646408 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.647776 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6be06d9-ad0f-4110-bba3-962524886f08-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.651383 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:47 crc kubenswrapper[4822]: E1010 06:24:47.651492 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.652043 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:47 crc kubenswrapper[4822]: E1010 06:24:47.652895 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.656104 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:47 crc kubenswrapper[4822]: E1010 06:24:47.656273 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.658304 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429
e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.659678 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnksp\" (UniqueName: \"kubernetes.io/projected/c6be06d9-ad0f-4110-bba3-962524886f08-kube-api-access-dnksp\") pod \"ovnkube-control-plane-749d76644c-9bczm\" (UID: \"c6be06d9-ad0f-4110-bba3-962524886f08\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.674356 4822 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.687871 4822 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.701217 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.714082 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.735564 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.735603 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.735611 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc 
kubenswrapper[4822]: I1010 06:24:47.735625 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.735634 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.795790 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" Oct 10 06:24:47 crc kubenswrapper[4822]: W1010 06:24:47.809997 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6be06d9_ad0f_4110_bba3_962524886f08.slice/crio-df4544632eb9003fc345ec85bbd2cdbc29441792d3f309f2328c5e23ab163d24 WatchSource:0}: Error finding container df4544632eb9003fc345ec85bbd2cdbc29441792d3f309f2328c5e23ab163d24: Status 404 returned error can't find the container with id df4544632eb9003fc345ec85bbd2cdbc29441792d3f309f2328c5e23ab163d24 Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.844070 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.844121 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.844132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.844165 4822 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.844178 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.945838 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.945885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.945926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.945943 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.945954 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:47Z","lastTransitionTime":"2025-10-10T06:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.946899 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" event={"ID":"c6be06d9-ad0f-4110-bba3-962524886f08","Type":"ContainerStarted","Data":"df4544632eb9003fc345ec85bbd2cdbc29441792d3f309f2328c5e23ab163d24"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.948893 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/1.log" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.949679 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/0.log" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.952977 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815" exitCode=1 Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.953046 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815"} Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.953099 4822 scope.go:117] "RemoveContainer" containerID="392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.953639 4822 scope.go:117] "RemoveContainer" containerID="7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815" Oct 10 06:24:47 crc kubenswrapper[4822]: E1010 06:24:47.954019 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.969621 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:47 crc kubenswrapper[4822]: I1010 06:24:47.985550 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.000349 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.015271 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.033539 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.047699 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.049565 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.049614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.049627 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.049642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.049651 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.063293 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.075596 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.088277 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.104082 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.120875 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.133222 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.145937 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.151781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.151869 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.151898 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.151923 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.151935 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.167028 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:45Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}\\\\nI1010 06:24:45.210431 6107 services_controller.go:445] Built service openshift-controller-manager-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1010 06:24:45.210400 6107 services_controller.go:452] Built service openshift-kube-storage-version-migrator-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1010 06:24:45.210449 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1010 06:24:45.210426 6107 services_controller.go:451] Built service openshift-kube-apiserver/apiserver cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1010 06:24:45.210463 6107 services_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is 
after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee
67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.179066 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.205551 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.254858 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.254903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.254914 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.254931 4822 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.254943 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.357335 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.357397 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.357408 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.357421 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.357429 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.460226 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.460281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.460296 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.460314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.460326 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.563613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.563743 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.563761 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.563784 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.563836 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.667238 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.667287 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.667297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.667311 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.667322 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.770107 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.770191 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.770206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.770229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.770241 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.872794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.872853 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.872865 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.872881 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.872892 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.959468 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" event={"ID":"c6be06d9-ad0f-4110-bba3-962524886f08","Type":"ContainerStarted","Data":"81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.959523 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" event={"ID":"c6be06d9-ad0f-4110-bba3-962524886f08","Type":"ContainerStarted","Data":"5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.961926 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/1.log" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.967036 4822 scope.go:117] "RemoveContainer" containerID="7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815" Oct 10 06:24:48 crc kubenswrapper[4822]: E1010 06:24:48.967222 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.972745 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-25l92"] Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.973412 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:48 crc kubenswrapper[4822]: E1010 06:24:48.973525 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.974423 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.974450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.974459 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.974472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.974482 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:48Z","lastTransitionTime":"2025-10-10T06:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:48 crc kubenswrapper[4822]: I1010 06:24:48.994747 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392175a93e4299b4ddac663e8d15f1fec882cd91d31405bb12fd9fe8428098af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:45Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}\\\\nI1010 06:24:45.210431 6107 services_controller.go:445] Built service openshift-controller-manager-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI1010 06:24:45.210400 6107 services_controller.go:452] Built service openshift-kube-storage-version-migrator-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1010 06:24:45.210449 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1010 06:24:45.210426 6107 services_controller.go:451] Built service openshift-kube-apiserver/apiserver cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1010 06:24:45.210463 6107 services_\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is 
after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee
67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.008711 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.020146 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.039240 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.053685 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1
dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.053945 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.054184 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srss6\" (UniqueName: \"kubernetes.io/projected/8a5c431a-2c94-41ca-aba2-c7a04c4908db-kube-api-access-srss6\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc 
kubenswrapper[4822]: I1010 06:24:49.065831 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.076950 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.077010 4822 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.077024 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.077042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.077055 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.080341 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.092047 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.107310 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.120786 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.133715 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.145937 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.155226 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.155534 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srss6\" (UniqueName: \"kubernetes.io/projected/8a5c431a-2c94-41ca-aba2-c7a04c4908db-kube-api-access-srss6\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.155405 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.155783 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs 
podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:24:49.655765188 +0000 UTC m=+36.750923384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.161782 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\
\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.170752 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srss6\" (UniqueName: \"kubernetes.io/projected/8a5c431a-2c94-41ca-aba2-c7a04c4908db-kube-api-access-srss6\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.174997 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.178725 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.178751 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.178759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.178771 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.178778 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.188512 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a9
4712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.201258 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.213595 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1
d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.225021 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc 
kubenswrapper[4822]: I1010 06:24:49.238952 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.253437 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.273079 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.282077 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.282180 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.282219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.282243 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.282256 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.291693 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.306164 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.318910 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.331787 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.347012 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.357914 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.358115 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:25:05.358085738 +0000 UTC m=+52.453243924 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.366349 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.381117 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.385478 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.385532 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.385548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.385579 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.385593 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.396891 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.408247 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.422717 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.441660 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.459451 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.459504 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.459546 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.459587 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459716 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459745 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459742 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459874 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459759 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459897 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:05.45986318 +0000 UTC m=+52.555021546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.460090 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:05.460069606 +0000 UTC m=+52.555227802 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.459716 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.460149 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.460180 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.460104 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:05.460098567 +0000 UTC m=+52.555256763 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.460265 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:05.460231881 +0000 UTC m=+52.555390247 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.463478 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.488453 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.488502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.488514 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.488531 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.488544 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.590598 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.590646 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.590659 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.590677 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.590690 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.650159 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.650219 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.650163 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.650338 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.650464 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.650560 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.661337 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.661479 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: E1010 06:24:49.661543 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:24:50.661528182 +0000 UTC m=+37.756686378 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.693556 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.693594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.693602 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.693616 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.693628 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.796309 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.796343 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.796351 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.796368 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.796378 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.898722 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.898766 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.898778 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.898793 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:49 crc kubenswrapper[4822]: I1010 06:24:49.898824 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:49Z","lastTransitionTime":"2025-10-10T06:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.002169 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.002218 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.002231 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.002249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.002264 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.104988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.105026 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.105038 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.105054 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.105066 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.119743 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.119787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.119817 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.119835 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.119847 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.134554 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.138868 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.138905 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.138914 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.138933 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.138946 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.152427 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.157852 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.157901 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.157932 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.157953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.157967 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.172006 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.176968 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.177020 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.177030 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.177046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.177056 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.196061 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.199837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.199886 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.199896 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.199911 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.199923 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.211059 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.211164 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.212974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.213027 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.213039 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.213061 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.213072 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.316038 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.316075 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.316084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.316099 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.316108 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.419254 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.419288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.419298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.419316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.419327 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.523344 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.523412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.523433 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.523462 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.523481 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.626961 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.627342 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.627522 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.627734 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.628002 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.649476 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92"
Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.649734 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.673878 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92"
Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.674110 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:24:50 crc kubenswrapper[4822]: E1010 06:24:50.674214 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:24:52.674185071 +0000 UTC m=+39.769343307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.731300 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.731895 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.731918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.731935 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.731947 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.835471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.835533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.835551 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.835579 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.835596 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.938776 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.938849 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.938866 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.938887 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:50 crc kubenswrapper[4822]: I1010 06:24:50.938901 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:50Z","lastTransitionTime":"2025-10-10T06:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.041563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.041605 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.041616 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.041633 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.041646 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.144642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.144716 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.144735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.144762 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.144779 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.247508 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.247561 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.247576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.247597 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.247611 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.350342 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.350500 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.350527 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.350555 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.350576 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.454134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.454198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.454211 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.454233 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.454249 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.557935 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.557988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.558000 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.558017 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.558028 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.650248 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.650625 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.650542 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 10 06:24:51 crc kubenswrapper[4822]: E1010 06:24:51.650946 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:24:51 crc kubenswrapper[4822]: E1010 06:24:51.651147 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 10 06:24:51 crc kubenswrapper[4822]: E1010 06:24:51.651319 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.660016 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.660252 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.660336 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.660415 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.660491 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.763628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.764371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.764541 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.764670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.764774 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.868043 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.868339 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.868423 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.868564 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.868651 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.971790 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.971847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.971857 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.971873 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:51 crc kubenswrapper[4822]: I1010 06:24:51.971885 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:51Z","lastTransitionTime":"2025-10-10T06:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.073976 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.074037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.074054 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.074081 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.074101 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.176604 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.176951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.177016 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.177450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.177516 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.280720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.280778 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.280793 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.280848 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.280861 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.383289 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.383329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.383337 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.383351 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.383361 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.485881 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.485929 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.485941 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.485958 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.485969 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.588446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.588483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.588492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.588506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.588515 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.649270 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92"
Oct 10 06:24:52 crc kubenswrapper[4822]: E1010 06:24:52.649418 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.690994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.691037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.691047 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.691062 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.691072 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.698201 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92"
Oct 10 06:24:52 crc kubenswrapper[4822]: E1010 06:24:52.698369 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:24:52 crc kubenswrapper[4822]: E1010 06:24:52.698469 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:24:56.698447962 +0000 UTC m=+43.793606158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.793888 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.793932 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.793942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.793957 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.793966 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.896648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.896695 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.896704 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.896719 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.896729 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.999198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.999253 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.999262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.999277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:52 crc kubenswrapper[4822]: I1010 06:24:52.999286 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:52Z","lastTransitionTime":"2025-10-10T06:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.102273 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.102312 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.102320 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.102334 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.102344 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.204702 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.204738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.204747 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.204759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.204767 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.307620 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.307702 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.307768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.307840 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.307869 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.410335 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.410383 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.410446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.410468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.410479 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.514052 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.514102 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.514113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.514130 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.514141 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.616446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.616480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.616488 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.616502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.616513 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.650087 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.650163 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:53 crc kubenswrapper[4822]: E1010 06:24:53.650245 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.650124 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:53 crc kubenswrapper[4822]: E1010 06:24:53.650355 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:53 crc kubenswrapper[4822]: E1010 06:24:53.650385 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.671281 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-1
0T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.687051 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1
d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.699891 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc 
kubenswrapper[4822]: I1010 06:24:53.716973 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.718740 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.718793 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.718826 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.718859 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.718874 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.731693 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.750425 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.765504 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.779104 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.791970 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.804085 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.818620 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.821601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.821641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.821650 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.821664 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.821674 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.833516 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.846653 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.860481 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.870654 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.882038 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.898747 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.924987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.925056 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.925067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.925086 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:53 crc kubenswrapper[4822]: I1010 06:24:53.925097 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:53Z","lastTransitionTime":"2025-10-10T06:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.028099 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.028141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.028150 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.028167 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.028178 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.131285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.131628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.131768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.132031 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.132252 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.235589 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.235649 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.235668 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.235694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.235713 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.338472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.338529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.338543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.338560 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.338571 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.417415 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.438545 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fa
bf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.441238 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.441285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.441297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.441314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.441325 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.452357 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.466701 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.480673 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.491005 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.500587 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc 
kubenswrapper[4822]: I1010 06:24:54.516727 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc 
kubenswrapper[4822]: I1010 06:24:54.532449 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041
ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.543952 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.544002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.544014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.544032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.544045 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.550853 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.564753 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.577436 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.589623 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.602148 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.614151 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774
a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.629440 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.646664 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.646743 4822 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.646759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.646778 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.646789 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.648508 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.649573 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:54 crc kubenswrapper[4822]: E1010 06:24:54.649690 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.661519 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:54Z is after 2025-08-24T17:21:41Z" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.749523 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.749554 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.749562 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.749575 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.749585 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.852705 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.852752 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.852765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.852781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.852791 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.955739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.955787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.955818 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.955835 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:54 crc kubenswrapper[4822]: I1010 06:24:54.955849 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:54Z","lastTransitionTime":"2025-10-10T06:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.058723 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.058846 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.058885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.058915 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.058933 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.161692 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.161740 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.161754 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.161776 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.161791 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.265717 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.265842 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.265858 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.265877 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.265894 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.368253 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.368300 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.368314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.368331 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.368344 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.471250 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.471286 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.471294 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.471307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.471314 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.573700 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.573731 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.573741 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.573757 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.573768 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.649873 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.649915 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:55 crc kubenswrapper[4822]: E1010 06:24:55.649996 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.649873 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:55 crc kubenswrapper[4822]: E1010 06:24:55.650106 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:55 crc kubenswrapper[4822]: E1010 06:24:55.650200 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.676082 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.676129 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.676141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.676158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.676170 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.779567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.779625 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.779638 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.779656 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.779669 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.882201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.882249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.882260 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.882276 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.882287 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.984359 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.984430 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.984442 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.984458 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:55 crc kubenswrapper[4822]: I1010 06:24:55.984470 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:55Z","lastTransitionTime":"2025-10-10T06:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.087182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.087249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.087261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.087297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.087311 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.189549 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.189943 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.189954 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.189968 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.189980 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.292936 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.292992 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.293012 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.293030 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.293042 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.395553 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.395589 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.395600 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.395613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.395621 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.499033 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.499296 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.499354 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.499418 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.499474 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.602156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.602192 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.602201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.602218 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.602227 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.650082 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:56 crc kubenswrapper[4822]: E1010 06:24:56.650242 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.705697 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.705741 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.705752 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.705769 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.705780 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.741577 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:56 crc kubenswrapper[4822]: E1010 06:24:56.741738 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:56 crc kubenswrapper[4822]: E1010 06:24:56.741834 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:25:04.741784754 +0000 UTC m=+51.836942980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.808431 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.808506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.808525 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.808548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.808563 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.910769 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.910869 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.910885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.910903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:56 crc kubenswrapper[4822]: I1010 06:24:56.910917 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:56Z","lastTransitionTime":"2025-10-10T06:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.013077 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.013122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.013133 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.013148 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.013161 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.116112 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.116148 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.116157 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.116170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.116179 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.218437 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.218472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.218482 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.218495 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.218505 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.321447 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.321494 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.321502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.321517 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.321526 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.423865 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.423925 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.423944 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.423966 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.423981 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.527008 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.527056 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.527067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.527081 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.527089 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.629012 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.629052 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.629063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.629078 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.629088 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.650660 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.650688 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:57 crc kubenswrapper[4822]: E1010 06:24:57.650879 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:57 crc kubenswrapper[4822]: E1010 06:24:57.651009 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.650692 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:57 crc kubenswrapper[4822]: E1010 06:24:57.651095 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.732328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.732369 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.732378 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.732392 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.732403 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.835525 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.835573 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.835582 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.835595 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.835606 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.939181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.939237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.939254 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.939277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:57 crc kubenswrapper[4822]: I1010 06:24:57.939295 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:57Z","lastTransitionTime":"2025-10-10T06:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.041561 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.041613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.041631 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.041658 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.041676 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.144857 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.144945 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.144975 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.145021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.145050 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.248442 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.248483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.248502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.248522 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.248539 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.352194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.352251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.352263 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.352282 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.352294 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.455068 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.455118 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.455130 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.455149 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.455160 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.557838 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.557895 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.557905 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.557922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.557934 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.649848 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:24:58 crc kubenswrapper[4822]: E1010 06:24:58.649991 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.661022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.661063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.661076 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.661091 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.661102 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.763492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.763537 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.763547 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.763563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.763574 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.866757 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.866826 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.866844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.866860 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.866873 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.970046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.970107 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.970122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.970145 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:58 crc kubenswrapper[4822]: I1010 06:24:58.970163 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:58Z","lastTransitionTime":"2025-10-10T06:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.072441 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.072488 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.072499 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.072516 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.072529 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.174934 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.174996 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.175007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.175024 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.175036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.278422 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.278490 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.278532 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.278567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.278590 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.381613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.381676 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.381692 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.381716 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.381733 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.484185 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.484213 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.484223 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.484237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.484263 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.587383 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.587413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.587422 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.587434 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.587442 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.649598 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:24:59 crc kubenswrapper[4822]: E1010 06:24:59.649737 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.650055 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:24:59 crc kubenswrapper[4822]: E1010 06:24:59.650272 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.650391 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:24:59 crc kubenswrapper[4822]: E1010 06:24:59.650555 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.690003 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.690079 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.690104 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.690129 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.690151 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.794688 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.795198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.795236 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.795270 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.795296 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.897882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.897938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.897953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.897970 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:24:59 crc kubenswrapper[4822]: I1010 06:24:59.897982 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:24:59Z","lastTransitionTime":"2025-10-10T06:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.000249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.000306 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.000327 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.000354 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.000377 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.103454 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.103533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.103555 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.103601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.103623 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.207352 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.207426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.207444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.207472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.207489 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.310125 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.310155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.310163 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.310175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.310184 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.364382 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.364420 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.364437 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.364457 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.364469 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.381662 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:00Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.386076 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.386187 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.386246 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.386281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.386307 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.398735 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:00Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.402157 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.402213 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.402224 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.402242 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.402257 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.413761 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:00Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.417031 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.417063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.417071 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.417084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.417093 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.428090 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:00Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.431907 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.431943 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.431951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.431966 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.431976 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.450273 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:00Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.450397 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.452195 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.452232 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.452245 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.452262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.452274 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.555127 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.555169 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.555180 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.555194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.555204 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.649340 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:00 crc kubenswrapper[4822]: E1010 06:25:00.649579 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.650657 4822 scope.go:117] "RemoveContainer" containerID="7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.659756 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.659834 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.659854 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.659879 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.659896 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.762735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.762783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.762792 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.762828 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.762840 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.864973 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.864997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.865006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.865020 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.865028 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.967471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.967511 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.967525 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.967543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:00 crc kubenswrapper[4822]: I1010 06:25:00.967553 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:00Z","lastTransitionTime":"2025-10-10T06:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.006527 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/1.log" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.008748 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.009219 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.027753 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 
06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.041545 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.058140 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.069757 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.069829 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.069839 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.069853 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.069862 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.073497 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.089069 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.103434 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.127849 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.144149 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abc
a1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10
T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.158388 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.172347 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.172396 4822 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.172404 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.172419 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.172428 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.178245 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is 
after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.188832 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.210390 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.222738 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.236764 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.256661 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.268903 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.275137 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.275176 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.275185 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.275199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.275208 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.281860 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:01 crc 
kubenswrapper[4822]: I1010 06:25:01.377281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.377320 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.377331 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.377346 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.377355 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.479564 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.479608 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.479617 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.479631 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.479639 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.582754 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.582821 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.582835 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.582854 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.582872 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.650062 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.650198 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:01 crc kubenswrapper[4822]: E1010 06:25:01.650364 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.650385 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:01 crc kubenswrapper[4822]: E1010 06:25:01.650527 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:01 crc kubenswrapper[4822]: E1010 06:25:01.650733 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.685306 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.685356 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.685365 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.685379 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.685389 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.789424 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.789477 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.789497 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.789570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.789592 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.892979 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.893022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.893033 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.893050 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.893064 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.995124 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.995154 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.995162 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.995176 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:01 crc kubenswrapper[4822]: I1010 06:25:01.995186 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:01Z","lastTransitionTime":"2025-10-10T06:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.013540 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/2.log" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.014115 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/1.log" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.016851 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" exitCode=1 Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.016899 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.016944 4822 scope.go:117] "RemoveContainer" containerID="7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.017559 4822 scope.go:117] "RemoveContainer" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" Oct 10 06:25:02 crc kubenswrapper[4822]: E1010 06:25:02.017851 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.040558 4822 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.057266 4822 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a256082
42f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.070786 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.084035 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.097454 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.097486 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.097503 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 
06:25:02.097517 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.097526 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.102174 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.116036 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.128112 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.138773 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774
a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.152740 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.170740 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c3680e293a5ba999fb2c5bac20269da570e6c862e1eadd5e83c7a68a14ec815\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:24:46Z\\\",\\\"message\\\":\\\"eject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1010 06:24:46.727190 6259 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1010 06:24:46.727194 6259 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF1010 06:24:46.728591 6259 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:24:46Z is after\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni
-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.180693 4822 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200318 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200331 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.200567 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.213244 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.224378 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.238675 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.249528 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.261916 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:02Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:02 crc 
kubenswrapper[4822]: I1010 06:25:02.302938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.302993 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.303005 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.303023 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.303036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.405635 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.405672 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.405684 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.405701 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.405711 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.507710 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.507767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.507783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.507840 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.507890 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.610929 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.610955 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.610963 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.610976 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.610984 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.649987 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:02 crc kubenswrapper[4822]: E1010 06:25:02.650179 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.714036 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.714084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.714092 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.714106 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.714116 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.816352 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.816392 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.816403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.816420 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.816434 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.919288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.919329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.919337 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.919353 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:02 crc kubenswrapper[4822]: I1010 06:25:02.919365 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:02Z","lastTransitionTime":"2025-10-10T06:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.021374 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.021418 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.021430 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.021445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.021456 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.022550 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/2.log" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.026957 4822 scope.go:117] "RemoveContainer" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" Oct 10 06:25:03 crc kubenswrapper[4822]: E1010 06:25:03.027252 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.056963 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.078507 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.098333 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.117791 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.127735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.127798 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.127856 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.127885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.127903 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.134886 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.145373 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc 
kubenswrapper[4822]: I1010 06:25:03.157516 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.171329 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.184340 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.195448 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.207329 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.220018 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.230542 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.230570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.230579 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.230593 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.230602 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.231671 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.244468 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.257679 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.275666 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.286501 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.333274 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.333304 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.333333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.333348 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.333358 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.436175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.436220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.436228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.436243 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.436252 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.539196 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.539261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.539278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.539301 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.539318 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.642626 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.642711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.642722 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.642740 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.642751 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.649962 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.650082 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.650211 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:03 crc kubenswrapper[4822]: E1010 06:25:03.650178 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:03 crc kubenswrapper[4822]: E1010 06:25:03.650333 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:03 crc kubenswrapper[4822]: E1010 06:25:03.650559 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.670163 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc 
kubenswrapper[4822]: I1010 06:25:03.691595 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.706942 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.737109 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.749872 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.749938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.749960 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.749990 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.750010 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.757740 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.774840 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.788395 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.797051 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.811018 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330
b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.823430 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.833498 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.843538 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.852129 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.852172 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.852180 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.852194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.852202 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.854239 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.864398 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.882539 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.892111 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.910421 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.954972 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.955015 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.955025 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.955041 4822 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 10 06:25:03 crc kubenswrapper[4822]: I1010 06:25:03.955054 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:03Z","lastTransitionTime":"2025-10-10T06:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.057278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.057324 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.057333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.057347 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.057356 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.160328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.160381 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.160391 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.160411 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.160421 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.263678 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.263721 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.263731 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.263748 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.263758 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.367163 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.367220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.367238 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.367262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.367278 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.469421 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.469476 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.469489 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.469510 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.469523 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.572044 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.572088 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.572099 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.572114 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.572128 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.649922 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:04 crc kubenswrapper[4822]: E1010 06:25:04.650076 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.674479 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.674535 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.674548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.674573 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.674588 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.777154 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.777192 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.777200 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.777214 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.777225 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.826736 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:04 crc kubenswrapper[4822]: E1010 06:25:04.826899 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:04 crc kubenswrapper[4822]: E1010 06:25:04.826958 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:25:20.826942258 +0000 UTC m=+67.922100454 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.879356 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.879399 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.879412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.879427 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.879436 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.981174 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.981234 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.981246 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.981268 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:04 crc kubenswrapper[4822]: I1010 06:25:04.981281 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:04Z","lastTransitionTime":"2025-10-10T06:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.084232 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.084299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.084325 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.084354 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.084375 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.187885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.187952 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.187973 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.188003 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.188027 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.291964 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.292008 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.292017 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.292030 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.292039 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.395438 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.395484 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.395494 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.395509 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.395523 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.432632 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.432862 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:25:37.432832467 +0000 UTC m=+84.527990663 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.498695 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.498754 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.498763 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.498778 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.499186 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.533956 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.534066 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.534100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.534127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534122 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534227 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:37.534212669 +0000 UTC m=+84.629370865 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534230 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534270 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534285 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534377 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:37.534357573 +0000 UTC m=+84.629515849 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534423 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534450 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:37.534442015 +0000 UTC m=+84.629600211 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534505 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534516 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534526 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.534547 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:25:37.534541738 +0000 UTC m=+84.629699934 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.601939 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.602010 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.602033 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.602061 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.602082 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.649918 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.649980 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.649923 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.650126 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.650219 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:05 crc kubenswrapper[4822]: E1010 06:25:05.650447 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.705791 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.705856 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.705864 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.705878 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.705887 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.811145 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.811241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.811410 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.811561 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.811605 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.913868 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.913906 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.913918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.913958 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:05 crc kubenswrapper[4822]: I1010 06:25:05.913973 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:05Z","lastTransitionTime":"2025-10-10T06:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.016030 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.016105 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.016128 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.016156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.016176 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.118995 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.119035 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.119045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.119060 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.119070 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.222252 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.222631 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.222641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.222656 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.222665 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.325377 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.325436 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.325446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.325468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.325482 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.428409 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.428448 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.428458 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.428471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.428480 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.531118 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.531170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.531182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.531202 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.531214 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.633737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.633811 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.633840 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.633859 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.633871 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.649330 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:06 crc kubenswrapper[4822]: E1010 06:25:06.649512 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.735792 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.735862 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.735885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.735899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.735907 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.838198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.838245 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.838256 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.838272 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.838282 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.942032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.942094 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.942106 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.942126 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:06 crc kubenswrapper[4822]: I1010 06:25:06.942138 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:06Z","lastTransitionTime":"2025-10-10T06:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.044532 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.044613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.044640 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.044671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.044698 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.147737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.147785 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.147794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.147836 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.147854 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.251024 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.251070 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.251080 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.251098 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.251110 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.353529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.353573 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.353585 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.353598 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.353607 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.455625 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.455711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.455734 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.455767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.455789 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.558603 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.558691 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.558705 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.558751 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.558771 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.650092 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.650193 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.650126 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:07 crc kubenswrapper[4822]: E1010 06:25:07.650282 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:07 crc kubenswrapper[4822]: E1010 06:25:07.650449 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:07 crc kubenswrapper[4822]: E1010 06:25:07.650627 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.662357 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.662434 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.662476 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.662509 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.662534 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.766055 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.766109 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.766120 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.766136 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.766151 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.884100 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.884155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.884171 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.884196 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.884213 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.987395 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.987439 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.987448 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.987463 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:07 crc kubenswrapper[4822]: I1010 06:25:07.987471 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:07Z","lastTransitionTime":"2025-10-10T06:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.089913 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.089960 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.089971 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.089987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.089998 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.192780 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.192882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.192892 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.192906 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.192916 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.295770 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.295879 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.295891 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.295906 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.295915 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.398112 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.398153 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.398165 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.398179 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.398191 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.500978 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.501017 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.501045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.501061 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.501069 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.603711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.603765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.603777 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.603792 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.603816 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.650309 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:08 crc kubenswrapper[4822]: E1010 06:25:08.650474 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.705548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.705589 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.705601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.705617 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.705629 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.808584 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.808643 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.808654 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.808671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.808681 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.911453 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.911495 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.911504 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.911520 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:08 crc kubenswrapper[4822]: I1010 06:25:08.911531 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:08Z","lastTransitionTime":"2025-10-10T06:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.014392 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.014446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.014460 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.014479 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.014492 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.116728 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.116844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.116871 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.116900 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.116919 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.219268 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.219317 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.219328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.219344 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.219355 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.322285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.322343 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.322360 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.322382 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.322400 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.425200 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.425250 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.425261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.425281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.425294 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.528021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.528073 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.528084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.528098 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.528107 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.592243 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.602121 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.616681 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690
bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630396 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630428 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630439 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630454 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630465 4822 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.630520 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.644703 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.649295 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.649387 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.649427 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:09 crc kubenswrapper[4822]: E1010 06:25:09.649484 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:09 crc kubenswrapper[4822]: E1010 06:25:09.649575 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:09 crc kubenswrapper[4822]: E1010 06:25:09.649688 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.664073 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.683285 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.695258 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc 
kubenswrapper[4822]: I1010 06:25:09.708480 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc 
kubenswrapper[4822]: I1010 06:25:09.720256 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041
ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.731434 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.733943 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.733994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.734005 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.734029 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.734044 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.742700 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.754341 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.766699 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.776018 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.785590 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774
a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.797918 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.825274 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.836549 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.836586 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.836596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.836612 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.836621 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.837085 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.939096 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.939132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.939143 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.939158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:09 crc kubenswrapper[4822]: I1010 06:25:09.939168 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:09Z","lastTransitionTime":"2025-10-10T06:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.041714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.041767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.041782 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.041825 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.041841 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.144831 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.144899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.144912 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.144959 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.144977 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.246707 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.247340 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.247439 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.247519 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.247792 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.350614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.350891 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.350961 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.351050 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.351110 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.453867 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.453917 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.453931 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.453949 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.453961 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.556765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.556806 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.556815 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.556852 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.556860 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.584946 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.585002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.585013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.585027 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.585036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.600508 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:10Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.606206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.606577 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.606707 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.606791 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.606898 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.624071 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:10Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.628821 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.628863 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.628872 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.628886 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.628895 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.644042 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:10Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.647551 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.647787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.647799 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.647814 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.647845 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.649380 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.649502 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.661648 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:10Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.665282 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.665338 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.665370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.665388 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.665397 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.680881 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:10Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:10 crc kubenswrapper[4822]: E1010 06:25:10.680990 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.683198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.683228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.683237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.683251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.683263 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.786026 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.786092 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.786110 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.786134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.786150 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.889591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.889667 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.889691 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.889722 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.889745 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.993445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.993558 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.993623 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.993657 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:10 crc kubenswrapper[4822]: I1010 06:25:10.993680 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:10Z","lastTransitionTime":"2025-10-10T06:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.096513 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.096580 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.096592 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.096613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.096626 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.199714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.199781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.199797 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.199842 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.199860 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.302411 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.302459 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.302470 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.302488 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.302500 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.405541 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.405585 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.405593 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.405607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.405616 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.508219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.508253 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.508261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.508274 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.508284 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.612007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.612063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.612078 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.612097 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.612114 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.649603 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.649659 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.649659 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:11 crc kubenswrapper[4822]: E1010 06:25:11.649795 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:11 crc kubenswrapper[4822]: E1010 06:25:11.649959 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:11 crc kubenswrapper[4822]: E1010 06:25:11.650055 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.713926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.713994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.714006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.714022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.714033 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.817288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.817363 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.817377 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.817394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.817406 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.920117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.920170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.920187 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.920208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:11 crc kubenswrapper[4822]: I1010 06:25:11.920224 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:11Z","lastTransitionTime":"2025-10-10T06:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.023228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.023264 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.023271 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.023284 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.023293 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.125503 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.125544 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.125552 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.125568 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.125578 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.227874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.227964 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.227986 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.228056 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.228079 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.330591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.330648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.330660 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.330678 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.330692 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.433483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.433543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.433555 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.433572 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.433584 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.536460 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.536508 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.536520 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.536535 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.536547 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.638918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.638963 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.638974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.638991 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.639002 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.649468 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:12 crc kubenswrapper[4822]: E1010 06:25:12.649609 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.741894 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.741929 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.741937 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.741949 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.741958 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.844370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.844417 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.844426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.844443 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.844462 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.948145 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.948197 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.948214 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.948237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:12 crc kubenswrapper[4822]: I1010 06:25:12.948253 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:12Z","lastTransitionTime":"2025-10-10T06:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.050979 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.051186 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.051202 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.051216 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.051225 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.153670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.153781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.153847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.153884 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.153907 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.256573 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.256737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.256751 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.256922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.256935 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.359601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.359663 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.359678 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.359703 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.359718 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.461906 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.461962 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.461974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.462002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.462023 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.565136 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.565176 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.565185 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.565199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.565209 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.649899 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:13 crc kubenswrapper[4822]: E1010 06:25:13.650014 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.650079 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:13 crc kubenswrapper[4822]: E1010 06:25:13.650231 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.650359 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:13 crc kubenswrapper[4822]: E1010 06:25:13.650429 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.662683 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.670189 4822 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.670251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.670264 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.670285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.670301 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.674363 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc 
kubenswrapper[4822]: I1010 06:25:13.688381 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.701291 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.717206 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.731897 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.748855 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.759935 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.771333 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.775040 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.775067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.775075 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.775088 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.775096 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.786935 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z 
is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.801957 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.815599 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.827784 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.839988 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.851799 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.864348 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.877369 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.877632 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.877707 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.877878 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.877980 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.882223 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.902948 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.980495 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.980533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.980543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.980559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:13 crc kubenswrapper[4822]: I1010 06:25:13.980571 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:13Z","lastTransitionTime":"2025-10-10T06:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.083131 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.083199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.083222 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.083250 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.083271 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.187590 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.187690 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.187707 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.187726 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.187740 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.290797 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.291220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.291319 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.291406 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.291469 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.393908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.393948 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.393959 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.393975 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.393988 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.497133 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.497191 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.497209 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.497226 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.497237 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.599836 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.600277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.600288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.600305 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.600318 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.649228 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:14 crc kubenswrapper[4822]: E1010 06:25:14.649376 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.702748 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.702837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.702850 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.702866 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.702878 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.805511 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.805567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.805587 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.805607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.805618 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.909298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.909357 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.909371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.909392 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:14 crc kubenswrapper[4822]: I1010 06:25:14.909409 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:14Z","lastTransitionTime":"2025-10-10T06:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.011747 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.011798 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.011827 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.011845 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.011865 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.114371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.114433 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.114444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.114459 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.114468 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.217380 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.217461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.217476 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.217496 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.217507 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.319646 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.319704 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.319713 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.319727 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.319739 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.422348 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.422389 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.422399 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.422414 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.422424 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.525233 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.525292 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.525305 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.525321 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.525346 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.628361 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.628416 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.628426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.628438 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.628448 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.650281 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:15 crc kubenswrapper[4822]: E1010 06:25:15.650432 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.650449 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.650490 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:15 crc kubenswrapper[4822]: E1010 06:25:15.650554 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:15 crc kubenswrapper[4822]: E1010 06:25:15.650609 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.731161 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.731189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.731196 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.731208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.731219 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.833510 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.833554 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.833564 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.833588 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.833600 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.935663 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.935713 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.935726 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.935747 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:15 crc kubenswrapper[4822]: I1010 06:25:15.935759 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:15Z","lastTransitionTime":"2025-10-10T06:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.039559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.039613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.039623 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.039642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.039660 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.142190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.142228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.142238 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.142254 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.142264 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.244907 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.244942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.244951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.244968 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.244982 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.348558 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.348777 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.348794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.348846 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.348859 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.463156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.463380 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.463410 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.463834 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.463849 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.566261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.566305 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.566316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.566335 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.566346 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.649356 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:16 crc kubenswrapper[4822]: E1010 06:25:16.649566 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.670275 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.670577 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.670648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.670720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.670826 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.773113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.773484 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.773650 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.773778 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.773943 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.877320 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.877368 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.877379 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.877396 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.877409 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.980749 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.981411 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.981669 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.981928 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:16 crc kubenswrapper[4822]: I1010 06:25:16.982147 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:16Z","lastTransitionTime":"2025-10-10T06:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.085143 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.085555 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.085650 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.085725 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.085789 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.187521 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.187567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.187576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.187591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.187602 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.290962 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.290997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.291006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.291023 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.291034 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.394633 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.394700 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.394711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.394732 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.394749 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.497457 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.498002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.498110 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.498205 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.498311 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.600985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.601276 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.601372 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.601468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.601561 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.650206 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.650278 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.650564 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:17 crc kubenswrapper[4822]: E1010 06:25:17.650568 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:17 crc kubenswrapper[4822]: E1010 06:25:17.650672 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:17 crc kubenswrapper[4822]: E1010 06:25:17.650703 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.651350 4822 scope.go:117] "RemoveContainer" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" Oct 10 06:25:17 crc kubenswrapper[4822]: E1010 06:25:17.651685 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.704916 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.704958 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.704969 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.704984 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.704995 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.808077 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.808133 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.808144 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.808166 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.808176 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.910660 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.910714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.910728 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.910744 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:17 crc kubenswrapper[4822]: I1010 06:25:17.910756 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:17Z","lastTransitionTime":"2025-10-10T06:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.013847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.013880 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.013891 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.013907 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.013918 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.116938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.116975 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.116987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.117005 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.117017 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.220182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.220219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.220228 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.220241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.220252 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.322881 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.322934 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.322949 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.322965 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.322977 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.425697 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.425746 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.425758 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.425777 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.425789 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.528605 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.528665 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.528675 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.528693 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.528708 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.632377 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.632443 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.632455 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.632474 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.632487 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.650054 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:18 crc kubenswrapper[4822]: E1010 06:25:18.650247 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.735394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.735470 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.735488 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.735511 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.735548 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.838330 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.838389 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.838403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.838423 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.838435 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.940648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.940705 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.940717 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.940735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:18 crc kubenswrapper[4822]: I1010 06:25:18.940748 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:18Z","lastTransitionTime":"2025-10-10T06:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.043517 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.043610 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.043624 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.043644 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.043679 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.146017 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.146063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.146083 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.146146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.146158 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.249080 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.249131 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.249144 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.249161 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.249172 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.351528 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.351570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.351580 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.351594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.351602 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.455080 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.455155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.455170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.455193 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.455209 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.558515 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.558570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.558610 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.558628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.558640 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.649701 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.649782 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:19 crc kubenswrapper[4822]: E1010 06:25:19.649852 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.649788 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:19 crc kubenswrapper[4822]: E1010 06:25:19.649969 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:19 crc kubenswrapper[4822]: E1010 06:25:19.650084 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.660931 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.660981 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.660992 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.661006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.661016 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.764158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.764240 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.764263 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.764295 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.764318 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.867390 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.867433 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.867444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.867458 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.867468 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.969816 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.969858 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.969874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.969889 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:19 crc kubenswrapper[4822]: I1010 06:25:19.969900 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:19Z","lastTransitionTime":"2025-10-10T06:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.073051 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.073104 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.073116 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.073136 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.073149 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.175343 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.175398 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.175408 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.175426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.175437 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.277620 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.277661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.277670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.277685 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.277693 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.380926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.380974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.380985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.381001 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.381011 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.483245 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.483299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.483311 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.483333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.483347 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.586215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.586251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.586262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.586276 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.586287 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.650152 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:20 crc kubenswrapper[4822]: E1010 06:25:20.650287 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.689547 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.689610 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.689628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.689661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.689680 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.792188 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.792251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.792266 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.792286 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.792300 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.894526 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:20 crc kubenswrapper[4822]: E1010 06:25:20.894775 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:20 crc kubenswrapper[4822]: E1010 06:25:20.894923 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:25:52.894895776 +0000 UTC m=+99.990053972 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.895450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.895502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.895514 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.895530 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.895545 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.997552 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.997631 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.997648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.997665 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:20 crc kubenswrapper[4822]: I1010 06:25:20.997676 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:20Z","lastTransitionTime":"2025-10-10T06:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.080558 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.080616 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.080628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.080648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.080669 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.094662 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:21Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.098280 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.098308 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.098316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.098328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.098337 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.108201 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:21Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.112190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.112212 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.112220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.112231 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.112240 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.124733 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:21Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.128483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.128518 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.128530 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.128547 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.128558 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.140836 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:21Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.144046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.144072 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.144080 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.144092 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.144101 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.155387 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:21Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.155501 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.157190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.157244 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.157257 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.157277 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.157297 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.259786 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.259852 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.259863 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.259882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.259894 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.362737 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.362789 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.362817 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.362831 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.362840 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.465397 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.465450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.465462 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.465481 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.465493 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.567883 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.568372 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.568494 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.568598 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.568732 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.650315 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.650369 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.650374 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.651316 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.651389 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:21 crc kubenswrapper[4822]: E1010 06:25:21.651361 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.671021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.671066 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.671075 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.671092 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.671105 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.773774 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.773834 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.773846 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.773862 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.773874 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.876510 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.876562 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.876571 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.876586 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.876595 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.979936 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.979997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.980006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.980022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:21 crc kubenswrapper[4822]: I1010 06:25:21.980031 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:21Z","lastTransitionTime":"2025-10-10T06:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.082150 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.082192 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.082204 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.082219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.082230 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.185141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.185193 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.185206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.185224 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.185237 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.288325 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.288358 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.288369 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.288383 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.288394 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.390953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.391189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.391281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.391370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.391532 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.494145 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.494187 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.494198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.494215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.494227 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.596563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.596594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.596603 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.596616 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.596624 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.649789 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:22 crc kubenswrapper[4822]: E1010 06:25:22.650191 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.699516 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.700142 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.700215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.700298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.700435 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.803220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.803273 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.803281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.803296 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.803307 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.906328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.906385 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.906400 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.906421 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:22 crc kubenswrapper[4822]: I1010 06:25:22.906453 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:22Z","lastTransitionTime":"2025-10-10T06:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.009852 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.009895 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.009905 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.009925 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.009935 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.093540 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/0.log" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.093948 4822 generic.go:334] "Generic (PLEG): container finished" podID="ec9c77cf-dd02-4e39-b204-9f6540406973" containerID="9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b" exitCode=1 Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.094040 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerDied","Data":"9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.094744 4822 scope.go:117] "RemoveContainer" containerID="9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.112180 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.112946 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.113002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.113014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.113032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.113046 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.132793 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.151796 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.165296 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.179248 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.192072 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.202726 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.212163 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.215415 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.215530 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.215588 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.215655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.215720 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.225172 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.241303 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.263403 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.275209 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.294024 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.308102 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.318146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.318201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.318215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.318233 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.318247 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.321060 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.336994 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.349350 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.360960 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc 
kubenswrapper[4822]: I1010 06:25:23.420978 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.421016 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.421026 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.421042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.421053 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.522950 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.522985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.522994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.523007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.523015 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.625942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.625994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.626006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.626025 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.626036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.649697 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.649754 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:23 crc kubenswrapper[4822]: E1010 06:25:23.649868 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:23 crc kubenswrapper[4822]: E1010 06:25:23.649972 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.650037 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:23 crc kubenswrapper[4822]: E1010 06:25:23.650107 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.664539 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.685457 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.698171 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.716740 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.730329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.730381 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.730395 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.730416 4822 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.730432 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.733392 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.754364 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.772059 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.785716 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.798139 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc 
kubenswrapper[4822]: I1010 06:25:23.813133 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.829616 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.833329 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.833361 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.833371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 
crc kubenswrapper[4822]: I1010 06:25:23.833389 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.833400 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.843482 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.859746 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.874122 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.890061 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.901687 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.912429 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.923789 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774
a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.936404 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.936428 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.936435 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.936449 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:23 crc kubenswrapper[4822]: I1010 06:25:23.936458 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:23Z","lastTransitionTime":"2025-10-10T06:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.038945 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.039008 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.039019 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.039037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.039054 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.105017 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/0.log" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.105075 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerStarted","Data":"29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.128109 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.141859 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.141910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.141926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.141948 4822 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.141964 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.145411 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.160103 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.176632 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.188188 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.200764 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc 
kubenswrapper[4822]: I1010 06:25:24.218509 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041
ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.233142 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.244672 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.244712 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.244723 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.244742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.244758 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.247554 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.262582 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.277448 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.291538 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.305689 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.321870 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.333027 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.345673 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.346988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.347045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.347059 4822 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.347093 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.347113 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.370685 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.382599 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:24Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.449721 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.449757 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.449767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.449782 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.449791 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.553112 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.553159 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.553177 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.553201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.553213 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.649600 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:24 crc kubenswrapper[4822]: E1010 06:25:24.649755 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.655670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.655742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.655761 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.655783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.655802 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.759132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.759194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.759204 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.759222 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.759234 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.861738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.861784 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.861794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.861828 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.861839 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.964557 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.964622 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.964639 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.965083 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:24 crc kubenswrapper[4822]: I1010 06:25:24.965123 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:24Z","lastTransitionTime":"2025-10-10T06:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.068134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.068179 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.068190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.068205 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.068214 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.171046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.171120 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.171137 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.171167 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.171183 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.274181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.274243 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.274256 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.274278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.274292 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.376500 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.376529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.376536 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.376549 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.376557 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.478685 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.478716 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.478726 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.478739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.478748 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.580661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.580711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.580720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.580733 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.580743 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.649660 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.649720 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:25 crc kubenswrapper[4822]: E1010 06:25:25.649823 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.649898 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:25 crc kubenswrapper[4822]: E1010 06:25:25.650045 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:25 crc kubenswrapper[4822]: E1010 06:25:25.650140 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.682926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.682974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.682985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.683001 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.683014 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.785664 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.785718 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.785730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.785747 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.785761 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.889661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.889727 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.889740 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.889761 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.889773 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.993029 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.993088 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.993100 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.993121 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:25 crc kubenswrapper[4822]: I1010 06:25:25.993139 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:25Z","lastTransitionTime":"2025-10-10T06:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.095953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.095999 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.096008 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.096025 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.096036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.198212 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.198252 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.198263 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.198281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.198295 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.300703 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.300781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.300794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.300835 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.300850 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.403293 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.403340 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.403348 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.403363 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.403373 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.506648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.506690 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.506700 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.506715 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.506725 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.609062 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.609129 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.609147 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.609173 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.609192 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.649817 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:26 crc kubenswrapper[4822]: E1010 06:25:26.649944 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.712258 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.712292 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.712300 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.712314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.712324 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.815783 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.815850 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.815864 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.815882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.815896 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.918417 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.918450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.918458 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.918471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:26 crc kubenswrapper[4822]: I1010 06:25:26.918482 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:26Z","lastTransitionTime":"2025-10-10T06:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.021093 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.021135 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.021146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.021160 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.021169 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.123626 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.123681 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.123698 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.123724 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.123741 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.227108 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.227157 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.227170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.227189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.227201 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.329893 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.329938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.329951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.329967 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.329979 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.433087 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.433140 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.433153 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.433171 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.433183 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.536565 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.536615 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.536625 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.536640 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.536655 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.639535 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.639595 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.639607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.639628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.639642 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.652484 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:27 crc kubenswrapper[4822]: E1010 06:25:27.652617 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.652794 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:27 crc kubenswrapper[4822]: E1010 06:25:27.652861 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.652974 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:27 crc kubenswrapper[4822]: E1010 06:25:27.653024 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.742412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.742451 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.742460 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.742480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.742492 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.845940 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.845993 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.846014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.846043 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.846065 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.948490 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.948520 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.948527 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.948540 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:27 crc kubenswrapper[4822]: I1010 06:25:27.948550 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:27Z","lastTransitionTime":"2025-10-10T06:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.051862 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.051899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.051910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.051929 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.051942 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.154520 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.154571 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.154579 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.154596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.154606 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.256865 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.256916 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.256933 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.256949 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.256960 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.360627 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.360696 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.360730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.360759 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.360781 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.463309 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.463383 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.463405 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.463429 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.463447 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.565938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.565988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.565997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.566013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.566024 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.650280 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:28 crc kubenswrapper[4822]: E1010 06:25:28.650860 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.669596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.669679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.669703 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.669730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.669753 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.772725 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.772787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.772836 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.772864 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.772884 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.875414 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.875467 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.875476 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.875492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.875503 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.977755 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.977793 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.977826 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.977844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:28 crc kubenswrapper[4822]: I1010 06:25:28.977855 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:28Z","lastTransitionTime":"2025-10-10T06:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.081278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.081342 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.081361 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.081384 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.081401 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.185203 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.185258 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.185278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.185301 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.185319 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.287890 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.287916 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.287924 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.287938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.287948 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.390278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.390344 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.390362 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.390386 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.390403 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.492755 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.492826 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.492836 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.492849 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.492859 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.595609 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.595649 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.595657 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.595671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.595680 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.649200 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:29 crc kubenswrapper[4822]: E1010 06:25:29.649346 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.649603 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:29 crc kubenswrapper[4822]: E1010 06:25:29.649700 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.650001 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:29 crc kubenswrapper[4822]: E1010 06:25:29.650112 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.698523 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.698567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.698581 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.698601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.698615 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.801976 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.802057 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.802085 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.802117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.802141 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.909184 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.909257 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.909270 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.909289 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:29 crc kubenswrapper[4822]: I1010 06:25:29.909301 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:29Z","lastTransitionTime":"2025-10-10T06:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.011918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.012266 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.012407 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.012548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.012710 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.115599 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.115686 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.115705 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.115732 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.115752 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.218874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.218948 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.218970 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.218999 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.219021 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.321443 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.321479 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.321489 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.321502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.321511 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.424533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.424596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.424606 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.424623 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.424635 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.527594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.527672 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.527684 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.527699 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.527711 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.630468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.630533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.630544 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.630563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.630573 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.650236 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:30 crc kubenswrapper[4822]: E1010 06:25:30.650433 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.733204 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.733251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.733261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.733281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.733294 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.835482 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.835564 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.835575 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.835602 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.835618 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.938338 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.938412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.938429 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.938453 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:30 crc kubenswrapper[4822]: I1010 06:25:30.938469 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:30Z","lastTransitionTime":"2025-10-10T06:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.041608 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.041670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.041685 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.041710 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.041724 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.148241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.148315 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.148331 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.148376 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.148397 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.184590 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.184645 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.184663 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.184685 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.184700 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.198855 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:31Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.203914 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.203994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.204011 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.204033 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.204044 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.217115 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:31Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.221691 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.221742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.221751 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.221773 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.221787 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.234066 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:31Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.238594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.238643 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.238658 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.238683 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.238696 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.255823 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:31Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.260213 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.260256 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.260269 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.260289 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.260299 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.279550 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:31Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.279695 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.281670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.281705 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.281720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.281742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.281759 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.383542 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.383590 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.383601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.383618 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.383629 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.487591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.487641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.487655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.487674 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.487696 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.589886 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.590307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.590385 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.590461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.590564 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.650074 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.650230 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.650075 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.650371 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.650468 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:31 crc kubenswrapper[4822]: E1010 06:25:31.650657 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.694043 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.694375 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.694472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.694567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.694665 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.797193 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.797234 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.797245 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.797262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.797275 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.899654 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.899715 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.899738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.899768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:31 crc kubenswrapper[4822]: I1010 06:25:31.899793 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:31Z","lastTransitionTime":"2025-10-10T06:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.002121 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.002150 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.002158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.002173 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.002183 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.105010 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.105052 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.105063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.105078 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.105090 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.208086 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.208132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.208142 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.208162 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.208175 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.311123 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.311168 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.311179 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.311201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.311214 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.414156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.414211 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.414221 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.414239 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.414249 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.516688 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.516754 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.516773 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.516830 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.516850 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.618951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.619197 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.619288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.619359 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.619420 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.649972 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:32 crc kubenswrapper[4822]: E1010 06:25:32.650500 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.650729 4822 scope.go:117] "RemoveContainer" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.721922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.721977 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.721989 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.722009 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.722025 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.825670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.825731 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.825743 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.825762 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.825776 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.929538 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.929578 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.929586 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.929600 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:32 crc kubenswrapper[4822]: I1010 06:25:32.929610 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:32Z","lastTransitionTime":"2025-10-10T06:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.033490 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.033549 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.033563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.033585 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.033597 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.135926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.135971 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.135981 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.136007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.136019 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.137833 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/2.log" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.140991 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.142404 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.163208 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 
06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.188033 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.203423 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.217687 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.233189 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.238987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.239040 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.239054 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.239074 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.239088 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.245701 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.259785 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.275754 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] 
Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.288295 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.303823 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.323894 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] 
Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.335409 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.343005 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.343070 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.343086 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.343108 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.343130 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.372698 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.390907 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.407080 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.427855 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.441860 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.447062 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.447105 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.447117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.447134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.447146 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.458977 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc 
kubenswrapper[4822]: I1010 06:25:33.549577 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.549601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.549609 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.549621 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.549629 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.649999 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:33 crc kubenswrapper[4822]: E1010 06:25:33.650106 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.650328 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.650409 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:33 crc kubenswrapper[4822]: E1010 06:25:33.650467 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:33 crc kubenswrapper[4822]: E1010 06:25:33.650495 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.651850 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.651882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.651892 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.651904 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.651948 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.670487 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] 
Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.681700 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.695973 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.715922 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.736429 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1
dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.754842 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.754918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.754942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.754978 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.755003 4822 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.779056 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.795886 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc 
kubenswrapper[4822]: I1010 06:25:33.810846 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.823982 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.836908 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.855248 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.858115 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.858145 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.858155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 
06:25:33.858168 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.858178 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.870346 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.883986 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.895897 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.907332 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.920025 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 
06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.932494 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.943415 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.960285 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.960330 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.960343 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:33 crc 
kubenswrapper[4822]: I1010 06:25:33.960358 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:33 crc kubenswrapper[4822]: I1010 06:25:33.960369 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:33Z","lastTransitionTime":"2025-10-10T06:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.063551 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.063603 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.063615 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.063632 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.063643 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.148261 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/3.log" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.149238 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/2.log" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.152395 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" exitCode=1 Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.152489 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.152885 4822 scope.go:117] "RemoveContainer" containerID="4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.153790 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:25:34 crc kubenswrapper[4822]: E1010 06:25:34.154111 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.166836 4822 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.166885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.166899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.166921 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.166953 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.172710 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.186933 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.203060 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc 
kubenswrapper[4822]: I1010 06:25:34.224233 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.242561 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.256932 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.269591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.269641 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.269672 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.269692 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.269703 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.274349 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.293154 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.307263 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.320709 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.337487 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.359353 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 
06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.373171 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.373403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc 
kubenswrapper[4822]: I1010 06:25:34.373506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.373629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.373739 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.374574 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.386383 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.404171 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c50e6090f189d38651d6bd4fa273e83587072ac059463d8012c5052eb386731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:01Z\\\",\\\"message\\\":\\\"roller-manager-crc\\\\nI1010 06:25:01.501638 6495 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 in node crc\\\\nF1010 06:25:01.501534 6495 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:01Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:25:01.501648 6495 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-w2fl5 after 0 failed attempt(s)\\\\nI1010 06:25:01.501649 6495 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1010 06:25:01.501640 6495 obj_retry.go:386] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:33Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1010 06:25:33.607871 6857 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1010 06:25:33.607915 6857 factory.go:1336] Added *v1.Node event handler 7\\\\nI1010 06:25:33.607956 6857 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608242 6857 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1010 06:25:33.608334 6857 controller.go:132] Adding 
controller ef_node_controller event handlers\\\\nI1010 06:25:33.608377 6857 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1010 06:25:33.608388 6857 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1010 06:25:33.608403 6857 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1010 06:25:33.608429 6857 factory.go:656] Stopping watch factory\\\\nI1010 06:25:33.608449 6857 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:25:33.608487 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI1010 06:25:33.608505 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI1010 06:25:33.608512 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608525 6857 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:25:33.608629 6857 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"}
,{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.415366 4822 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.428325 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.451451 4822 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7
d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363
e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.476725 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.476781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.476791 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.476828 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.476897 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.579683 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.579713 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.579721 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.579733 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.579742 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.650045 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:34 crc kubenswrapper[4822]: E1010 06:25:34.650239 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.683214 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.683249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.683257 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.683272 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.683281 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.786347 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.786390 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.786401 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.786420 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.786431 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.889211 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.889271 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.889284 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.889307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.889322 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.992839 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.992910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.992928 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.992961 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:34 crc kubenswrapper[4822]: I1010 06:25:34.992982 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:34Z","lastTransitionTime":"2025-10-10T06:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.096287 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.096340 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.096357 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.096376 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.096394 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.158083 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/3.log" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.163336 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:25:35 crc kubenswrapper[4822]: E1010 06:25:35.163535 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.185217 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:33Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1010 06:25:33.607871 6857 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1010 06:25:33.607915 6857 factory.go:1336] Added *v1.Node event handler 7\\\\nI1010 06:25:33.607956 6857 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608242 6857 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1010 06:25:33.608334 6857 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1010 06:25:33.608377 6857 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1010 06:25:33.608388 6857 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1010 06:25:33.608403 6857 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1010 06:25:33.608429 6857 factory.go:656] Stopping watch factory\\\\nI1010 06:25:33.608449 6857 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:25:33.608487 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI1010 06:25:33.608505 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI1010 06:25:33.608512 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608525 6857 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:25:33.608629 6857 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.198738 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.200096 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.200156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.200173 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.200196 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.200209 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.215002 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.237099 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.255686 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2c
f313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.272069 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1
d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.285711 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc 
kubenswrapper[4822]: I1010 06:25:35.299208 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.303403 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.303448 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.303460 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.303480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.303492 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.312723 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.326453 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.338442 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.354197 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.368216 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.381182 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.393193 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.406654 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.406701 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.406713 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.406730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.406742 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.411831 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.425268 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.436456 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.509504 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.509588 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.509607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc 
kubenswrapper[4822]: I1010 06:25:35.509630 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.509648 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.611962 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.611999 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.612007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.612023 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.612032 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.650280 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:35 crc kubenswrapper[4822]: E1010 06:25:35.650398 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.650573 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:35 crc kubenswrapper[4822]: E1010 06:25:35.650621 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.650763 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:35 crc kubenswrapper[4822]: E1010 06:25:35.650843 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.713823 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.713867 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.713880 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.713899 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.713913 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.817429 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.817788 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.817908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.817998 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.818070 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.922093 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.922164 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.922180 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.922206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:35 crc kubenswrapper[4822]: I1010 06:25:35.922223 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:35Z","lastTransitionTime":"2025-10-10T06:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.025197 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.025521 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.025584 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.025699 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.025773 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.128359 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.128944 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.128992 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.129014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.129028 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.231566 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.231639 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.231668 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.231689 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.231699 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.335191 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.335269 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.335282 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.335306 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.335321 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.438305 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.438365 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.438376 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.438397 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.438429 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.541407 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.541452 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.541461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.541482 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.541492 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.650947 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.651063 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: E1010 06:25:36.651088 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.651113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.651133 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.651158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.651176 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.754335 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.754394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.754407 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.754423 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.754435 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.857993 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.858057 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.858071 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.858091 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.858105 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.960733 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.960776 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.960785 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.960823 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:36 crc kubenswrapper[4822]: I1010 06:25:36.960834 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:36Z","lastTransitionTime":"2025-10-10T06:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.063930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.063965 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.063984 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.064000 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.064010 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.168206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.168252 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.168262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.168278 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.168288 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.271164 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.271224 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.271241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.271263 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.271326 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.374208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.374267 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.374280 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.374297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.374309 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.478086 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.478141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.478152 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.478166 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.478176 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.483912 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.484054 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:41.484030277 +0000 UTC m=+148.579188473 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.580984 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.581048 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.581062 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.581078 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.581090 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.585669 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.585765 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.585835 4822 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.585870 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.585925 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.585906801 +0000 UTC m=+148.681064997 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.585955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.585973 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586007 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586027 4822 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586042 4822 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586086 4822 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.586068786 +0000 UTC m=+148.681226992 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586104 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586120 4822 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586120 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.586096616 +0000 UTC m=+148.681254852 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586130 4822 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.586198 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.586180459 +0000 UTC m=+148.681338725 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.649670 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.650046 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.650082 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.650179 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.650452 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:37 crc kubenswrapper[4822]: E1010 06:25:37.650551 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.683932 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.683964 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.683972 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.683984 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.683993 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.787439 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.787500 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.787522 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.787550 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.787572 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.889506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.889556 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.889565 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.889581 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.889590 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.991466 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.991501 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.991509 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.991521 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:37 crc kubenswrapper[4822]: I1010 06:25:37.991531 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:37Z","lastTransitionTime":"2025-10-10T06:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.094150 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.094200 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.094215 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.094230 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.094242 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.196581 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.196624 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.196634 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.196650 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.196662 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.299336 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.299376 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.299387 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.299404 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.299416 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.401967 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.402001 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.402013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.402030 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.402042 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.505739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.506124 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.506255 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.506394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.506578 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.609187 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.609243 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.609260 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.609284 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.609319 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.649732 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:38 crc kubenswrapper[4822]: E1010 06:25:38.649913 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.711688 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.711748 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.711766 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.711788 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.711866 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.814469 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.814746 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.815052 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.815124 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.815187 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.917230 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.917299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.917307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.917322 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:38 crc kubenswrapper[4822]: I1010 06:25:38.917332 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:38Z","lastTransitionTime":"2025-10-10T06:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.020519 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.021042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.021191 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.021316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.021436 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.124015 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.124421 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.124614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.124825 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.125038 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.228936 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.229005 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.229021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.229049 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.229070 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.332079 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.332132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.332144 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.332163 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.332182 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.434449 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.434491 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.434500 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.434514 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.434524 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.536248 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.536284 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.536294 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.536308 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.536319 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.638504 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.638552 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.638568 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.638589 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.638605 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.649621 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.649676 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.649623 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:39 crc kubenswrapper[4822]: E1010 06:25:39.649847 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:39 crc kubenswrapper[4822]: E1010 06:25:39.649991 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:39 crc kubenswrapper[4822]: E1010 06:25:39.650074 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.741157 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.741239 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.741265 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.741298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.741321 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.844041 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.844119 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.844132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.844175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.844187 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.947408 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.947764 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.947908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.947978 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:39 crc kubenswrapper[4822]: I1010 06:25:39.948043 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:39Z","lastTransitionTime":"2025-10-10T06:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.050475 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.050527 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.050538 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.050550 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.050559 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.152855 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.152926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.152939 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.152956 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.152968 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.255679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.255720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.255732 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.255746 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.255759 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.358451 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.358533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.358546 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.358583 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.358599 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.461774 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.461854 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.461865 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.461881 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.461893 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.565614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.565714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.565738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.565767 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.565790 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.649671 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:40 crc kubenswrapper[4822]: E1010 06:25:40.649991 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.669995 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.670081 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.670120 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.670139 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.670152 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.772291 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.772366 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.772379 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.772396 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.772407 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.875479 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.875523 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.875544 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.875567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.875582 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.978610 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.978659 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.978676 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.978694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:40 crc kubenswrapper[4822]: I1010 06:25:40.978707 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:40Z","lastTransitionTime":"2025-10-10T06:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.081887 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.081920 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.081930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.081945 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.081959 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.184930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.184996 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.185013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.185042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.185059 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.288049 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.288110 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.288124 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.288143 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.288155 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.390671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.390973 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.391117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.391214 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.391323 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.493426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.493487 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.493498 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.493517 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.493535 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.596168 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.596441 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.596532 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.596648 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.596750 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.628506 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.628735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.628794 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.628886 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.629011 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.645651 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.649783 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.649837 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.650072 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.650201 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.650379 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.650697 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.650827 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.650935 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.651022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.651102 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.653077 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.668959 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.673566 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.673718 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.673864 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.674155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.674242 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.686694 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.691190 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.691221 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.691232 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.691249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.691263 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.709164 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.714219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.714257 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.714279 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.714299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.714313 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.731616 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:41 crc kubenswrapper[4822]: E1010 06:25:41.731801 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.738601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.738664 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.738686 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.738715 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.738738 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.842018 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.842090 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.842113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.842140 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.842163 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.945621 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.945688 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.945711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.945739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:41 crc kubenswrapper[4822]: I1010 06:25:41.945761 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:41Z","lastTransitionTime":"2025-10-10T06:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.049315 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.049371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.049391 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.049420 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.049442 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.152371 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.152440 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.152463 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.152488 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.152507 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.256402 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.256449 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.256461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.256480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.256492 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.360407 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.360492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.360515 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.360548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.360573 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.463229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.463296 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.463320 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.463351 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.463372 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.566158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.566237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.566262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.566333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.566358 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.649227 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:42 crc kubenswrapper[4822]: E1010 06:25:42.650362 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.670183 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.670224 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.670237 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.670252 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.670264 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.772605 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.772686 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.772710 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.772732 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.772750 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.875675 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.875719 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.875727 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.875741 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.875751 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.977986 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.978034 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.978048 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.978068 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:42 crc kubenswrapper[4822]: I1010 06:25:42.978079 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:42Z","lastTransitionTime":"2025-10-10T06:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.080508 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.080541 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.080552 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.080567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.080579 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.183693 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.183728 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.183739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.183755 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.183765 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.286342 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.286380 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.286393 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.286414 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.286426 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.388406 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.388453 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.388463 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.388485 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.388497 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.491280 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.491406 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.491479 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.491513 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.491596 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.594763 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.594837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.594853 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.594868 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.594880 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.649870 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.649918 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.649870 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:43 crc kubenswrapper[4822]: E1010 06:25:43.650121 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:43 crc kubenswrapper[4822]: E1010 06:25:43.650030 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:43 crc kubenswrapper[4822]: E1010 06:25:43.650282 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.664865 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.683068 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:33Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1010 06:25:33.607871 6857 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1010 06:25:33.607915 6857 factory.go:1336] Added *v1.Node event handler 7\\\\nI1010 06:25:33.607956 6857 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608242 6857 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1010 06:25:33.608334 6857 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1010 06:25:33.608377 6857 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1010 06:25:33.608388 6857 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1010 06:25:33.608403 6857 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1010 06:25:33.608429 6857 factory.go:656] Stopping watch factory\\\\nI1010 06:25:33.608449 6857 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:25:33.608487 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI1010 06:25:33.608505 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI1010 06:25:33.608512 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608525 6857 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:25:33.608629 6857 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.695001 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.696694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.696718 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.696726 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.696738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.696747 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.718781 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.733562 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.749712 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.767965 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.783303 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.799951 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.800004 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.800016 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.800034 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.800048 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.801006 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc 
kubenswrapper[4822]: I1010 06:25:43.817213 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041
ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.831966 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e144dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.841913 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.855687 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.867258 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.877553 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.886020 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.898966 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.902306 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.902340 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.902351 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.902370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.902381 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:43Z","lastTransitionTime":"2025-10-10T06:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:43 crc kubenswrapper[4822]: I1010 06:25:43.911090 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.005259 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.005428 4822 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.005497 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.005575 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.005684 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.107913 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.108158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.108219 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.108284 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.108350 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.210885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.211321 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.211471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.211614 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.211747 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.314626 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.315003 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.315109 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.315207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.315334 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.417923 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.418632 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.418934 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.419199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.419403 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.522591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.522660 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.522683 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.522714 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.522736 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.625104 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.625175 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.625189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.625208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.625220 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.649690 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:44 crc kubenswrapper[4822]: E1010 06:25:44.649870 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.728104 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.728170 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.728182 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.728206 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.728222 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.831069 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.831119 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.831131 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.831152 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.831164 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.934768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.934831 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.934841 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.934858 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:44 crc kubenswrapper[4822]: I1010 06:25:44.934871 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:44Z","lastTransitionTime":"2025-10-10T06:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.037424 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.037472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.037483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.037503 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.037519 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.140122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.140174 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.140184 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.140201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.140222 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.242649 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.242701 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.242711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.242730 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.242742 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.346014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.346056 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.346067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.346085 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.346098 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.449550 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.449602 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.449613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.449629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.449645 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.552585 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.552651 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.552660 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.552682 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.552697 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.650109 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.650261 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.650326 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:45 crc kubenswrapper[4822]: E1010 06:25:45.650476 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:45 crc kubenswrapper[4822]: E1010 06:25:45.650643 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:45 crc kubenswrapper[4822]: E1010 06:25:45.650867 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.654922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.654970 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.654986 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.655010 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.655022 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.758151 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.758194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.758207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.758223 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.758428 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.861396 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.861444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.861456 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.861472 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.861487 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.963293 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.963346 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.963358 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.963386 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:45 crc kubenswrapper[4822]: I1010 06:25:45.963398 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:45Z","lastTransitionTime":"2025-10-10T06:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.066243 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.066282 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.066292 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.066307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.066319 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.168856 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.168917 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.168928 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.168946 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.168958 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.271928 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.272022 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.272039 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.272059 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.272074 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.374319 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.374386 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.374400 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.374442 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.374458 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.477050 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.477107 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.477117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.477134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.477145 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.579766 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.579863 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.579877 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.579895 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.579927 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.649541 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:46 crc kubenswrapper[4822]: E1010 06:25:46.649703 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.662734 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.682638 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.682986 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.683108 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.683220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.683310 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.786456 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.786509 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.786559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.786576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.786588 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.888916 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.888989 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.889004 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.889025 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.889040 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.991184 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.991232 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.991242 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.991262 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:46 crc kubenswrapper[4822]: I1010 06:25:46.991274 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:46Z","lastTransitionTime":"2025-10-10T06:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.094502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.094558 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.094572 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.094594 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.094609 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.197357 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.197410 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.197422 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.197445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.197457 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.299353 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.299400 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.299414 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.299431 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.299443 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.402655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.402701 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.402718 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.402746 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.402764 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.505971 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.506015 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.506027 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.506046 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.506058 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.608969 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.609025 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.609042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.609059 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.609069 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.649987 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:47 crc kubenswrapper[4822]: E1010 06:25:47.650163 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.650194 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.650256 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:47 crc kubenswrapper[4822]: E1010 06:25:47.650279 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:47 crc kubenswrapper[4822]: E1010 06:25:47.650507 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.711833 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.711893 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.711907 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.711924 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.711936 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.815412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.815484 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.815502 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.815529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.815548 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.918159 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.918201 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.918210 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.918224 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:47 crc kubenswrapper[4822]: I1010 06:25:47.918234 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:47Z","lastTransitionTime":"2025-10-10T06:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.020297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.020357 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.020367 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.020386 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.020399 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.122769 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.122857 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.122876 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.122896 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.122911 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.225222 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.225281 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.225291 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.225314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.225328 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.331484 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.331543 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.331556 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.331576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.331594 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.434094 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.434181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.434194 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.434220 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.434236 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.537471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.537903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.537915 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.537930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.537940 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.640563 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.640605 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.640615 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.640630 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.640641 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.650105 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:48 crc kubenswrapper[4822]: E1010 06:25:48.650291 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.742910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.742953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.742961 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.742974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.742982 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.845389 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.845426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.845435 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.845451 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.845460 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.948860 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.949130 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.949207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.949275 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:48 crc kubenswrapper[4822]: I1010 06:25:48.949353 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:48Z","lastTransitionTime":"2025-10-10T06:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.052323 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.052368 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.052380 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.052399 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.052411 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.155037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.155094 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.155102 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.155115 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.155124 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.257877 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.257918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.257928 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.257944 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.257954 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.360837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.360889 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.360898 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.360914 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.360925 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.463496 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.463547 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.463559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.463577 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.463589 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.566004 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.566047 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.566059 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.566075 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.566086 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.650033 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:49 crc kubenswrapper[4822]: E1010 06:25:49.650157 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.650328 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:49 crc kubenswrapper[4822]: E1010 06:25:49.650371 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.650545 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:49 crc kubenswrapper[4822]: E1010 06:25:49.650602 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.668441 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.668473 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.668481 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.668494 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.668505 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.770503 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.770918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.771075 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.771218 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.771359 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.873104 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.873134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.873144 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.873158 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.873167 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.975526 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.975568 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.975577 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.975591 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:49 crc kubenswrapper[4822]: I1010 06:25:49.975600 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:49Z","lastTransitionTime":"2025-10-10T06:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.077534 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.077573 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.077582 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.077596 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.077605 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.180032 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.180089 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.180101 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.180118 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.180130 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.282388 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.282433 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.282444 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.282458 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.282467 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.384662 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.384722 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.384738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.384766 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.384784 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.487911 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.487959 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.487967 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.487980 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.487989 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.590468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.590528 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.590539 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.590555 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.590565 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.649453 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:50 crc kubenswrapper[4822]: E1010 06:25:50.649713 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.650265 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:25:50 crc kubenswrapper[4822]: E1010 06:25:50.650394 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.693115 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.693156 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.693167 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.693181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.693222 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.796307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.796363 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.796384 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.796413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.796434 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.898830 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.898903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.898926 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.898956 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:50 crc kubenswrapper[4822]: I1010 06:25:50.898977 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:50Z","lastTransitionTime":"2025-10-10T06:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.001535 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.001574 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.001585 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.001600 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.001610 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.104369 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.104716 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.104974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.105172 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.105376 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.207587 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.208003 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.208240 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.208456 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.208664 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.311578 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.311638 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.311656 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.311679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.311696 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.414922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.414985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.415007 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.415035 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.415058 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.517874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.517938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.517949 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.517967 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.517980 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.620601 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.620660 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.620673 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.620691 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.620707 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.649417 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.649467 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.649444 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:51 crc kubenswrapper[4822]: E1010 06:25:51.649591 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:51 crc kubenswrapper[4822]: E1010 06:25:51.649757 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:51 crc kubenswrapper[4822]: E1010 06:25:51.649938 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.723546 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.723645 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.723661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.723685 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.723704 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.826207 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.826247 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.826255 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.826269 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.826278 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.930217 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.930333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.930347 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.930369 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:51 crc kubenswrapper[4822]: I1010 06:25:51.930385 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:51Z","lastTransitionTime":"2025-10-10T06:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.033769 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.033856 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.033868 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.033896 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.033911 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.118186 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.118222 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.118230 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.118244 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.118254 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.132355 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.137098 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.137164 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.137181 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.137203 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.137248 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.151653 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.156195 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.156872 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.156885 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.156898 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.156908 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.170574 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.174293 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.174338 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.174348 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.174360 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.174371 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.185972 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.189936 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.189971 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.189981 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.189995 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.190009 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.202277 4822 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"24930614-984a-4687-af00-12fa4519901f\\\",\\\"systemUUID\\\":\\\"b9d6aaf9-9893-464a-9c1f-35cedc127eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.202417 4822 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.204350 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.204409 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.204418 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.204439 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.204450 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.307135 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.307172 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.307183 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.307197 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.307207 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.413554 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.413617 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.413633 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.413655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.413672 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.516248 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.516299 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.516314 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.516335 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.516349 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.619412 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.619486 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.619510 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.619540 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.619563 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.650347 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.650544 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.722433 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.722511 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.722533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.722565 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.722586 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.825711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.825849 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.825870 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.825897 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.825914 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.929309 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.929388 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.929413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.929446 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.929472 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:52Z","lastTransitionTime":"2025-10-10T06:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:52 crc kubenswrapper[4822]: I1010 06:25:52.949153 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.949320 4822 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:52 crc kubenswrapper[4822]: E1010 06:25:52.949380 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs podName:8a5c431a-2c94-41ca-aba2-c7a04c4908db nodeName:}" failed. No retries permitted until 2025-10-10 06:26:56.949364223 +0000 UTC m=+164.044522419 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs") pod "network-metrics-daemon-25l92" (UID: "8a5c431a-2c94-41ca-aba2-c7a04c4908db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.032580 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.032632 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.032642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.032661 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.032672 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.135079 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.135122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.135133 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.135146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.135156 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.238084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.238179 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.238211 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.238246 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.238274 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.340551 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.340598 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.340620 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.340639 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.340654 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.443473 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.443570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.443609 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.443642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.443663 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.547191 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.547264 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.547288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.547321 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.547346 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.649365 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.649416 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.649512 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:53 crc kubenswrapper[4822]: E1010 06:25:53.649694 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:53 crc kubenswrapper[4822]: E1010 06:25:53.650067 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:53 crc kubenswrapper[4822]: E1010 06:25:53.650365 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.650381 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.650413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.650428 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.650447 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.650463 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.666149 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.680784 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d4a6cb7-61d8-4c6b-bb8d-2eed42f0b0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa57f479d871378f49b7e4d7d3533b3092523583c575255805a9932d1471adba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df559d4e56e5bce9d356a1a252a48076e1946ef756eca72cbad8d3ee3024135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4bce591679a3fd7b9fe66d943cc9f80b0a21d4958a4cfa9510d1c1fc49d902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9774d0bddc00e25282de64d37ebf19097089b19c4d806a034a3db2863f896c90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a314b
15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a314b15502646a1683c018935a35b7ca2f2706e3d034576d064022dfcfdb3339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c81b7957a0bfdc4e09e2dacd2f02a4d5191ffc754f7ab03a06e439656548567\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9210750a1dd35b9627ae2cf313d5955d1d34d4be5dbb358b55776d6f1c176e22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cq8kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrdcs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.694946 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6be06d9-ad0f-4110-bba3-962524886f08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a904d3d0abc3588816cc2fa981bd46e3c82568011c563fa984778ce30db7fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81c6f538d05fcfe7ad58b973b5f646196b1f1d7f39fea5b110af1a12e7161e0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dnksp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9bczm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.706344 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-25l92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a5c431a-2c94-41ca-aba2-c7a04c4908db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srss6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-25l92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc 
kubenswrapper[4822]: I1010 06:25:53.721161 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31386b14eac0efd5b2de8c912d644df9b6ca68636e0e2a441963b487b0e4ac9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.738743 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed3cb35-d91d-498d-8461-1df09c00272a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c448ca3613025b939025b7254d3d341ddf46d1ec3011469165d7f2159fd8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c843a79a94712c3acd885d51ebb0d23b636a02cd2e6d578c7f6ffaa636a7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://911a099aba0f2e84291b74ac308c778b84c5717501ecab0d8dfaf02804dfb270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0601ad33f41cd6e50a2b6c2604cffc294a2254dd111c4ca34d46e1
44dfa75dec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.753047 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4cc7814-2b06-45ae-b8d7-f7571d24084b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfeb32c46a971b85a6bd45b3be31ed44b2f6e0c2acaad901258f1e5e8121c168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9987060dac4145465b8c470fd323b0655f60275926c0ac851be664c7a61e6b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a89f57dc5734c81b1a02fe1417e7ee8398996489df90f533bb7f49bbedb4699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e5718dd633a340c2d6d9eff727b162d470cb281b327d90cc656c4b13a3a78540\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.753557 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.753742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.753860 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.753940 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.754115 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.767345 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.782888 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.798825 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1143e05ffedebe00b0c7c281c85e1eab66f860117c456cd592bb79d8bd511f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.811201 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-889pc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f12f36f-6d53-433a-9604-e70600ccdbe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://429e687414ba1048b56e6a1257157e051bae29b0114d584113d0d9816534c873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xd2z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-889pc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.829679 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5x2kt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec9c77cf-dd02-4e39-b204-9f6540406973\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:25:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:22Z\\\",\\\"message\\\":\\\"2025-10-10T06:24:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f\\\\n2025-10-10T06:24:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2757d6d4-8f51-4c3f-81d8-d73646c0b09f to /host/opt/cni/bin/\\\\n2025-10-10T06:24:37Z [verbose] multus-daemon started\\\\n2025-10-10T06:24:37Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:25:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:25:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z9t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5x2kt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.846595 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c139ecf2-36e5-4fac-bbce-cebf1ce97d0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://962d3b0e94d620dd141f866d74fa2e45400f8fe98e27fd5270b43f58d50dedc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed067afebc0041ffd40a3c3dfc1843fa7beb79b9d8c6c6a977bb1afdbe040e45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00818923ab3ce9dc0126a4bdef4dd5d16bcae483e821dbb32a67c57ff6fede7e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a64bafc8a1c8edd6b1ed57701928852abcc80bd9406225448304b9bee04d3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19d181046356a1a0cc4a0061ba7a098bb881555bc3a25608242f7498adf2725\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:24:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:24:28.241273 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:24:28.242031 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3243499720/tls.crt::/tmp/serving-cert-3243499720/tls.key\\\\\\\"\\\\nI1010 06:24:33.635020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:24:33.638686 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:24:33.638703 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:24:33.638728 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:24:33.638733 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:24:33.652974 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 
06:24:33.652992 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:24:33.653012 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:24:33.653034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:24:33.653041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:24:33.653046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:24:33.653052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:24:33.659016 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84bb618b4268161050e8ac5c36e2fcf3e0db06b49621cf423b47f3f62f782c70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://151c4ce0eb02fdc7bf35cf4aa477c39d9f1a7c5779b302f057fbc88cca77f4ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.858466 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.858529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc 
kubenswrapper[4822]: I1010 06:25:53.858544 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.858570 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.858584 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.862405 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86167202-f72a-4271-bdbe-32ba0bf71fff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c596946867a459918243a1311e83774a613abca1820731e59dd08e6f4b3361b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d
102ebf7ad2cf8b8a6427b6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t4hhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w2fl5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.884543 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba73c65764abba87b3607f92e052698ea8a4b83a11f52f0ee748971e9e5d2902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06f6ea17c0729bab3d0cd4b876f30381759d172282cae775841eb10cbfb94248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.911584 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bd611ad-9a8c-489f-903b-d75912bb1fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:25:33Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI1010 06:25:33.607871 6857 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1010 06:25:33.607915 6857 factory.go:1336] Added *v1.Node event handler 7\\\\nI1010 06:25:33.607956 6857 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608242 6857 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1010 06:25:33.608334 6857 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1010 06:25:33.608377 6857 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1010 06:25:33.608388 6857 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1010 06:25:33.608403 6857 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1010 06:25:33.608429 6857 factory.go:656] Stopping watch factory\\\\nI1010 06:25:33.608449 6857 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:25:33.608487 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI1010 06:25:33.608505 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI1010 06:25:33.608512 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1010 06:25:33.608525 6857 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:25:33.608629 6857 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc6a0b0f2ef7dc33eb
92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cngm6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bzbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.924482 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kwt79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e46a09c3-aa28-4097-bd15-5fe82d308dad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72c1009ec1dcd8540f953162af3bf4a1587c80ab217ad7d6fc51bad1b356c008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k44qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kwt79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.937629 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57e867f0-6eb6-47f1-ba30-b2cb6d8f8f86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137f964799e4e133b15ded88d77b866d8f21a6ca1629315307693070e9248b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://182e0f89b8f3646cc06497e0c6b03697a690f0b702eb6d4c2b9d793fa4b4788a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://182e0f89b8f3646cc06497e0c6b03697a690f0b702eb6d4c2b9d793fa4b4788a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:53 crc 
kubenswrapper[4822]: I1010 06:25:53.962221 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.962287 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.962297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.962333 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.962347 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:53Z","lastTransitionTime":"2025-10-10T06:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:53 crc kubenswrapper[4822]: I1010 06:25:53.963765 4822 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd37586-a4e6-461e-9f9e-c2ebd777f438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55a954e11bbadd30d8dd7635d9f835b9775da2c33003618d5987d0d696a1b927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb684c160aa83ad20dad05c050acb7d161a7265de7d440e8be882a9d355b1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3175ecefb7e6c8997ffa7e0fc93c17ac259de116c94863620b43ca48ece0759d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fea084b7775fb1128c46c1beb135f0dcf17b7230f5ece3eabf4b4e5078fdd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26075302dcf30becf7aacbf178001535bda407fc6172d1108c8c62166a996145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f009a3b3d0711438363e189a62c532763c9e5c7690bf7e44d5884bee579ef5d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ce30f8d6a441d4e2114c75e339359fd2d8504928d0f83ffccce15d58a5dca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:24:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://690fabf59b903afb3b7d69b0b126327886b657010e2da3e05e913f1f46a1623a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:24:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:24:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:25:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.065670 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.065734 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.065743 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.065765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.065778 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.169199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.169242 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.169251 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.169265 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.169275 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.271994 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.272097 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.272109 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.272155 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.272171 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.377976 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.378026 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.378037 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.378056 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.378069 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.481143 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.481189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.481198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.481216 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.481232 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.584628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.584681 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.584692 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.584715 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.584730 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.650125 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:54 crc kubenswrapper[4822]: E1010 06:25:54.650301 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.691956 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.692045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.692058 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.692085 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.692102 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.795832 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.795895 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.795909 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.795930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.795947 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.898898 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.898988 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.899002 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.899027 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:54 crc kubenswrapper[4822]: I1010 06:25:54.899040 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:54Z","lastTransitionTime":"2025-10-10T06:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.002574 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.002632 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.002644 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.002666 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.002681 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.105856 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.105908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.105918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.105931 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.105940 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.208653 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.208709 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.208722 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.208738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.208749 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.310675 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.310742 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.310753 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.310771 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.310784 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.413567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.413617 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.413629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.413647 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.413659 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.517234 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.517286 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.517297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.517316 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.517329 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.620426 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.620471 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.620480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.620499 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.620516 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.650068 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.650139 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:55 crc kubenswrapper[4822]: E1010 06:25:55.650203 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:55 crc kubenswrapper[4822]: E1010 06:25:55.650294 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.650476 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:55 crc kubenswrapper[4822]: E1010 06:25:55.650527 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.723847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.723901 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.723911 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.723938 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.723952 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.827613 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.827655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.827663 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.827679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.827692 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.930501 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.930559 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.930578 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.930600 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:55 crc kubenswrapper[4822]: I1010 06:25:55.930615 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:55Z","lastTransitionTime":"2025-10-10T06:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.033728 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.033775 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.033790 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.033831 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.033848 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.136825 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.136879 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.136891 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.136908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.136918 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.240718 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.240776 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.240791 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.240831 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.240846 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.344548 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.344606 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.344617 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.344636 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.344648 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.447610 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.447669 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.447686 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.447707 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.447724 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.550595 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.550655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.550671 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.550693 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.550711 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.650233 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:56 crc kubenswrapper[4822]: E1010 06:25:56.650484 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.653227 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.653290 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.653304 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.653345 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.653359 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.755946 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.755996 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.756014 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.756038 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.756050 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.858468 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.858518 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.858533 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.858557 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.858569 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.961398 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.961437 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.961445 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.961459 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:56 crc kubenswrapper[4822]: I1010 06:25:56.961468 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:56Z","lastTransitionTime":"2025-10-10T06:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.064297 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.064370 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.064381 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.064400 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.064411 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.167315 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.167381 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.167394 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.167414 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.167427 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.270530 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.270576 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.270587 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.270607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.270618 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.373084 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.373131 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.373146 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.373165 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.373178 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.475874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.475913 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.475922 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.475941 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.475952 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.579006 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.579058 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.579067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.579085 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.579097 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.651061 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.651057 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:57 crc kubenswrapper[4822]: E1010 06:25:57.651295 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:57 crc kubenswrapper[4822]: E1010 06:25:57.651205 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.651077 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:57 crc kubenswrapper[4822]: E1010 06:25:57.651384 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.682210 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.682305 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.682325 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.682628 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.682925 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.785247 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.785318 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.785340 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.785372 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.785395 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.888132 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.888176 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.888189 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.888208 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.888544 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.990655 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.990688 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.990698 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.990711 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:57 crc kubenswrapper[4822]: I1010 06:25:57.990720 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:57Z","lastTransitionTime":"2025-10-10T06:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.093230 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.093279 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.093291 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.093312 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.093329 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.195942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.195987 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.195998 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.196013 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.196023 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.298019 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.298070 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.298099 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.298117 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.298128 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.400720 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.400768 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.400784 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.400837 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.400857 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.504229 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.504288 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.504307 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.504328 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.504345 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.607629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.607680 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.607689 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.607703 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.607715 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.649717 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:25:58 crc kubenswrapper[4822]: E1010 06:25:58.649978 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.709429 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.709461 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.709470 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.709483 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.709493 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.811747 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.811779 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.811787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.811824 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.811838 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.914042 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.914076 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.914086 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.914100 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:58 crc kubenswrapper[4822]: I1010 06:25:58.914108 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:58Z","lastTransitionTime":"2025-10-10T06:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.016514 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.016567 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.016586 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.016607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.016622 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.118619 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.118657 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.118666 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.118684 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.118694 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.221626 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.221667 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.221679 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.221694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.221703 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.324311 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.324380 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.324393 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.324413 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.324425 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.426480 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.426512 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.426523 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.426537 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.426545 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.528765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.528862 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.528882 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.528908 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.528925 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.631910 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.631974 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.631992 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.632016 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.632036 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.649701 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.649784 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.649871 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:25:59 crc kubenswrapper[4822]: E1010 06:25:59.650008 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:25:59 crc kubenswrapper[4822]: E1010 06:25:59.650121 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:25:59 crc kubenswrapper[4822]: E1010 06:25:59.650223 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.734173 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.734230 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.734241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.734266 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.734281 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.837622 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.837683 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.837698 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.837719 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.837731 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.940738 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.940828 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.940845 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.940871 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:25:59 crc kubenswrapper[4822]: I1010 06:25:59.940885 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:25:59Z","lastTransitionTime":"2025-10-10T06:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.042746 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.042815 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.042830 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.042849 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.042863 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.144844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.144903 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.144921 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.144942 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.144957 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.246913 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.246959 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.246970 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.246985 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.246996 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.349214 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.349241 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.349249 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.349261 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.349269 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.451574 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.451629 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.451642 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.451659 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.451671 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.554071 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.554113 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.554124 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.554141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.554154 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.649274 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:00 crc kubenswrapper[4822]: E1010 06:26:00.649453 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.656064 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.656110 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.656122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.656138 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.656152 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.758745 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.758814 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.758829 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.758848 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.758857 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.862091 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.862198 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.862456 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.862535 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.862768 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.966694 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.966753 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.966766 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.966781 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:00 crc kubenswrapper[4822]: I1010 06:26:00.966797 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:00Z","lastTransitionTime":"2025-10-10T06:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.070739 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.070860 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.070884 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.071350 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.071624 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.174946 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.175057 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.175083 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.175105 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.175120 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.277495 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.277531 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.277542 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.277562 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.277570 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.380057 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.380122 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.380134 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.380153 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.380165 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.482871 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.482918 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.482930 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.482945 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.482954 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.584765 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.584833 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.584847 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.584863 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.584874 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.649599 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.649649 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:01 crc kubenswrapper[4822]: E1010 06:26:01.649948 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.649971 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:01 crc kubenswrapper[4822]: E1010 06:26:01.650046 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:01 crc kubenswrapper[4822]: E1010 06:26:01.650159 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.687485 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.687522 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.687531 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.687545 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.687561 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.790028 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.790067 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.790102 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.790138 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.790150 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.892933 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.892997 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.893021 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.893045 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.893063 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.995088 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.995141 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.995152 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.995199 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:01 crc kubenswrapper[4822]: I1010 06:26:01.995214 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:01Z","lastTransitionTime":"2025-10-10T06:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.097450 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.097492 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.097509 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.097524 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.097535 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.199878 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.199927 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.199937 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.199953 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.199964 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.302874 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.302931 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.302944 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.302956 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.302965 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.405844 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.405884 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.405892 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.405905 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.405914 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.509298 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.509356 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.509372 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.509399 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.509411 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.588674 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.588735 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.588757 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.588787 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.588839 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.618529 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.618597 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.618607 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.618623 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.618635 4822 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:26:02Z","lastTransitionTime":"2025-10-10T06:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.649567 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:02 crc kubenswrapper[4822]: E1010 06:26:02.649766 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.652671 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv"] Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.653113 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.656997 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.657054 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.657183 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.657336 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.698086 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nrdcs" podStartSLOduration=88.698061211 podStartE2EDuration="1m28.698061211s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.682433795 +0000 UTC m=+109.777592001" watchObservedRunningTime="2025-10-10 06:26:02.698061211 +0000 UTC m=+109.793219427" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.698332 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9bczm" podStartSLOduration=87.698323139 podStartE2EDuration="1m27.698323139s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.697379141 +0000 UTC m=+109.792537367" watchObservedRunningTime="2025-10-10 06:26:02.698323139 +0000 UTC m=+109.793481345" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.755329 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.755307696 podStartE2EDuration="53.755307696s" podCreationTimestamp="2025-10-10 06:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.754927524 +0000 UTC m=+109.850085740" watchObservedRunningTime="2025-10-10 06:26:02.755307696 +0000 UTC m=+109.850465892" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.761536 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.761578 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea901f6-3cfe-4fba-9be0-35243d5df583-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: 
I1010 06:26:02.761600 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ea901f6-3cfe-4fba-9be0-35243d5df583-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.761663 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea901f6-3cfe-4fba-9be0-35243d5df583-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.761722 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.827763 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-889pc" podStartSLOduration=88.827742643 podStartE2EDuration="1m28.827742643s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.812686014 +0000 UTC m=+109.907844210" watchObservedRunningTime="2025-10-10 06:26:02.827742643 +0000 UTC m=+109.922900839" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.828271 4822 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-5x2kt" podStartSLOduration=88.828265029 podStartE2EDuration="1m28.828265029s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.827739503 +0000 UTC m=+109.922897719" watchObservedRunningTime="2025-10-10 06:26:02.828265029 +0000 UTC m=+109.923423225" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.845434 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.845416241 podStartE2EDuration="1m29.845416241s" podCreationTimestamp="2025-10-10 06:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.843910365 +0000 UTC m=+109.939068581" watchObservedRunningTime="2025-10-10 06:26:02.845416241 +0000 UTC m=+109.940574437" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.860422 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.860404458 podStartE2EDuration="1m22.860404458s" podCreationTimestamp="2025-10-10 06:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.860235263 +0000 UTC m=+109.955393469" watchObservedRunningTime="2025-10-10 06:26:02.860404458 +0000 UTC m=+109.955562654" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862415 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea901f6-3cfe-4fba-9be0-35243d5df583-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862482 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862520 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862544 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea901f6-3cfe-4fba-9be0-35243d5df583-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862566 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ea901f6-3cfe-4fba-9be0-35243d5df583-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862679 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.862688 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4ea901f6-3cfe-4fba-9be0-35243d5df583-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.863433 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ea901f6-3cfe-4fba-9be0-35243d5df583-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.869118 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea901f6-3cfe-4fba-9be0-35243d5df583-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.872974 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podStartSLOduration=88.8729619 podStartE2EDuration="1m28.8729619s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.872536338 +0000 
UTC m=+109.967694544" watchObservedRunningTime="2025-10-10 06:26:02.8729619 +0000 UTC m=+109.968120096" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.881463 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ea901f6-3cfe-4fba-9be0-35243d5df583-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9qrv\" (UID: \"4ea901f6-3cfe-4fba-9be0-35243d5df583\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.906419 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kwt79" podStartSLOduration=88.906400649 podStartE2EDuration="1m28.906400649s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.906281466 +0000 UTC m=+110.001439682" watchObservedRunningTime="2025-10-10 06:26:02.906400649 +0000 UTC m=+110.001558845" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.932893 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.932872886 podStartE2EDuration="16.932872886s" podCreationTimestamp="2025-10-10 06:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.917867009 +0000 UTC m=+110.013025225" watchObservedRunningTime="2025-10-10 06:26:02.932872886 +0000 UTC m=+110.028031082" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.955872 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=88.955856176 podStartE2EDuration="1m28.955856176s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:02.95532012 +0000 UTC m=+110.050478326" watchObservedRunningTime="2025-10-10 06:26:02.955856176 +0000 UTC m=+110.051014362" Oct 10 06:26:02 crc kubenswrapper[4822]: I1010 06:26:02.971155 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.251669 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" event={"ID":"4ea901f6-3cfe-4fba-9be0-35243d5df583","Type":"ContainerStarted","Data":"4081527a55c4cbb3ce3b64311972829a8b0ac017cdeda70e97fb3aca9beef3ac"} Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.252047 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" event={"ID":"4ea901f6-3cfe-4fba-9be0-35243d5df583","Type":"ContainerStarted","Data":"f61a2e536ab57fe4c50cc3a7e1d9447dbb7d43bb72e66b7248b720977272ec1a"} Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.267054 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9qrv" podStartSLOduration=89.267031058 podStartE2EDuration="1m29.267031058s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:03.266416229 +0000 UTC m=+110.361574455" watchObservedRunningTime="2025-10-10 06:26:03.267031058 +0000 UTC m=+110.362189264" Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.649916 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.649959 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:03 crc kubenswrapper[4822]: E1010 06:26:03.651966 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:03 crc kubenswrapper[4822]: I1010 06:26:03.652296 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:03 crc kubenswrapper[4822]: E1010 06:26:03.652454 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:03 crc kubenswrapper[4822]: E1010 06:26:03.652815 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:04 crc kubenswrapper[4822]: I1010 06:26:04.649896 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:04 crc kubenswrapper[4822]: E1010 06:26:04.650095 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:05 crc kubenswrapper[4822]: I1010 06:26:05.649400 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:05 crc kubenswrapper[4822]: I1010 06:26:05.649480 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:05 crc kubenswrapper[4822]: E1010 06:26:05.649564 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:05 crc kubenswrapper[4822]: I1010 06:26:05.649813 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:05 crc kubenswrapper[4822]: E1010 06:26:05.649793 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:05 crc kubenswrapper[4822]: E1010 06:26:05.650281 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:05 crc kubenswrapper[4822]: I1010 06:26:05.650554 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:26:05 crc kubenswrapper[4822]: E1010 06:26:05.650765 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6bzbn_openshift-ovn-kubernetes(2bd611ad-9a8c-489f-903b-d75912bb1fef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" Oct 10 06:26:06 crc kubenswrapper[4822]: I1010 06:26:06.649779 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:06 crc kubenswrapper[4822]: E1010 06:26:06.649926 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:07 crc kubenswrapper[4822]: I1010 06:26:07.649695 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:07 crc kubenswrapper[4822]: I1010 06:26:07.649743 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:07 crc kubenswrapper[4822]: I1010 06:26:07.649766 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:07 crc kubenswrapper[4822]: E1010 06:26:07.649879 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:07 crc kubenswrapper[4822]: E1010 06:26:07.649960 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:07 crc kubenswrapper[4822]: E1010 06:26:07.650026 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:08 crc kubenswrapper[4822]: I1010 06:26:08.650162 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:08 crc kubenswrapper[4822]: E1010 06:26:08.650375 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.276351 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/1.log" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.277008 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/0.log" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.277070 4822 generic.go:334] "Generic (PLEG): container finished" podID="ec9c77cf-dd02-4e39-b204-9f6540406973" containerID="29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5" exitCode=1 Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.277124 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerDied","Data":"29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5"} Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.277203 4822 scope.go:117] "RemoveContainer" containerID="9aa0276ebbfe532177eaf90495fa58c0070d5b852d3f01bd69b5aea6a30d330b" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.278225 4822 scope.go:117] "RemoveContainer" containerID="29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5" Oct 10 06:26:09 crc kubenswrapper[4822]: E1010 06:26:09.278912 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5x2kt_openshift-multus(ec9c77cf-dd02-4e39-b204-9f6540406973)\"" pod="openshift-multus/multus-5x2kt" podUID="ec9c77cf-dd02-4e39-b204-9f6540406973" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.649580 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.649615 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:09 crc kubenswrapper[4822]: I1010 06:26:09.649921 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:09 crc kubenswrapper[4822]: E1010 06:26:09.649915 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:09 crc kubenswrapper[4822]: E1010 06:26:09.650060 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:09 crc kubenswrapper[4822]: E1010 06:26:09.650084 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:10 crc kubenswrapper[4822]: I1010 06:26:10.281094 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/1.log" Oct 10 06:26:10 crc kubenswrapper[4822]: I1010 06:26:10.650213 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:10 crc kubenswrapper[4822]: E1010 06:26:10.650427 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:11 crc kubenswrapper[4822]: I1010 06:26:11.649608 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:11 crc kubenswrapper[4822]: I1010 06:26:11.649631 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:11 crc kubenswrapper[4822]: I1010 06:26:11.649722 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:11 crc kubenswrapper[4822]: E1010 06:26:11.649976 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:11 crc kubenswrapper[4822]: E1010 06:26:11.650413 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:11 crc kubenswrapper[4822]: E1010 06:26:11.650762 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:12 crc kubenswrapper[4822]: I1010 06:26:12.650149 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:12 crc kubenswrapper[4822]: E1010 06:26:12.650349 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:13 crc kubenswrapper[4822]: E1010 06:26:13.644895 4822 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 10 06:26:13 crc kubenswrapper[4822]: I1010 06:26:13.649368 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:13 crc kubenswrapper[4822]: I1010 06:26:13.649426 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:13 crc kubenswrapper[4822]: I1010 06:26:13.649480 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:13 crc kubenswrapper[4822]: E1010 06:26:13.650381 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:13 crc kubenswrapper[4822]: E1010 06:26:13.650479 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:13 crc kubenswrapper[4822]: E1010 06:26:13.650572 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:13 crc kubenswrapper[4822]: E1010 06:26:13.742948 4822 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:26:14 crc kubenswrapper[4822]: I1010 06:26:14.649865 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:14 crc kubenswrapper[4822]: E1010 06:26:14.650106 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:15 crc kubenswrapper[4822]: I1010 06:26:15.650045 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:15 crc kubenswrapper[4822]: I1010 06:26:15.650067 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:15 crc kubenswrapper[4822]: E1010 06:26:15.651062 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:15 crc kubenswrapper[4822]: I1010 06:26:15.650204 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:15 crc kubenswrapper[4822]: E1010 06:26:15.651267 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:15 crc kubenswrapper[4822]: E1010 06:26:15.651405 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:16 crc kubenswrapper[4822]: I1010 06:26:16.649336 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:16 crc kubenswrapper[4822]: E1010 06:26:16.649541 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:17 crc kubenswrapper[4822]: I1010 06:26:17.649785 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:17 crc kubenswrapper[4822]: I1010 06:26:17.649916 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:17 crc kubenswrapper[4822]: E1010 06:26:17.650267 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:17 crc kubenswrapper[4822]: I1010 06:26:17.650288 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:17 crc kubenswrapper[4822]: E1010 06:26:17.650460 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:17 crc kubenswrapper[4822]: E1010 06:26:17.650491 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:17 crc kubenswrapper[4822]: I1010 06:26:17.651159 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.307661 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/3.log" Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.310584 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerStarted","Data":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.311099 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.349424 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podStartSLOduration=104.349405299 podStartE2EDuration="1m44.349405299s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:18.348404268 +0000 UTC m=+125.443562514" 
watchObservedRunningTime="2025-10-10 06:26:18.349405299 +0000 UTC m=+125.444563495" Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.502557 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25l92"] Oct 10 06:26:18 crc kubenswrapper[4822]: I1010 06:26:18.502687 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:18 crc kubenswrapper[4822]: E1010 06:26:18.502789 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:18 crc kubenswrapper[4822]: E1010 06:26:18.745407 4822 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:26:19 crc kubenswrapper[4822]: I1010 06:26:19.649603 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:19 crc kubenswrapper[4822]: I1010 06:26:19.649606 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:19 crc kubenswrapper[4822]: E1010 06:26:19.650043 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:19 crc kubenswrapper[4822]: E1010 06:26:19.650098 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:19 crc kubenswrapper[4822]: I1010 06:26:19.649621 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:19 crc kubenswrapper[4822]: E1010 06:26:19.650175 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:20 crc kubenswrapper[4822]: I1010 06:26:20.649435 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:20 crc kubenswrapper[4822]: E1010 06:26:20.649556 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:21 crc kubenswrapper[4822]: I1010 06:26:21.649997 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:21 crc kubenswrapper[4822]: I1010 06:26:21.650023 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:21 crc kubenswrapper[4822]: I1010 06:26:21.650267 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:21 crc kubenswrapper[4822]: E1010 06:26:21.650213 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:21 crc kubenswrapper[4822]: E1010 06:26:21.650382 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:21 crc kubenswrapper[4822]: E1010 06:26:21.650531 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:22 crc kubenswrapper[4822]: I1010 06:26:22.649377 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:22 crc kubenswrapper[4822]: E1010 06:26:22.649600 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:23 crc kubenswrapper[4822]: I1010 06:26:23.650296 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:23 crc kubenswrapper[4822]: I1010 06:26:23.650363 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:23 crc kubenswrapper[4822]: I1010 06:26:23.650381 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:23 crc kubenswrapper[4822]: E1010 06:26:23.652784 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:23 crc kubenswrapper[4822]: E1010 06:26:23.652974 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:23 crc kubenswrapper[4822]: E1010 06:26:23.652671 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:23 crc kubenswrapper[4822]: E1010 06:26:23.745844 4822 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:26:24 crc kubenswrapper[4822]: I1010 06:26:24.650168 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:24 crc kubenswrapper[4822]: E1010 06:26:24.650357 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:24 crc kubenswrapper[4822]: I1010 06:26:24.650679 4822 scope.go:117] "RemoveContainer" containerID="29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5" Oct 10 06:26:25 crc kubenswrapper[4822]: I1010 06:26:25.335984 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/1.log" Oct 10 06:26:25 crc kubenswrapper[4822]: I1010 06:26:25.336043 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerStarted","Data":"e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad"} Oct 10 06:26:25 crc kubenswrapper[4822]: I1010 06:26:25.649420 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:25 crc kubenswrapper[4822]: I1010 06:26:25.649513 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:25 crc kubenswrapper[4822]: I1010 06:26:25.649535 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:25 crc kubenswrapper[4822]: E1010 06:26:25.650291 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:25 crc kubenswrapper[4822]: E1010 06:26:25.650378 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:25 crc kubenswrapper[4822]: E1010 06:26:25.650438 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:26 crc kubenswrapper[4822]: I1010 06:26:26.649960 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:26 crc kubenswrapper[4822]: E1010 06:26:26.650433 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:27 crc kubenswrapper[4822]: I1010 06:26:27.650004 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:27 crc kubenswrapper[4822]: E1010 06:26:27.650139 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:26:27 crc kubenswrapper[4822]: I1010 06:26:27.650009 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:27 crc kubenswrapper[4822]: I1010 06:26:27.650004 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:27 crc kubenswrapper[4822]: E1010 06:26:27.650246 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:26:27 crc kubenswrapper[4822]: E1010 06:26:27.650370 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:26:28 crc kubenswrapper[4822]: I1010 06:26:28.650179 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:28 crc kubenswrapper[4822]: E1010 06:26:28.650340 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-25l92" podUID="8a5c431a-2c94-41ca-aba2-c7a04c4908db" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.649879 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.649914 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.649897 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.652392 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.652838 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.652987 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 10 06:26:29 crc kubenswrapper[4822]: I1010 06:26:29.652996 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 10 06:26:30 crc kubenswrapper[4822]: I1010 06:26:30.649545 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:30 crc kubenswrapper[4822]: I1010 06:26:30.651877 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 10 06:26:30 crc kubenswrapper[4822]: I1010 06:26:30.652127 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.335477 4822 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.388928 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h59tt"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.389915 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.390038 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kznjd"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.391187 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.392780 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.393304 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.393923 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.394107 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.396840 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.397279 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.402022 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fwsh4"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.403343 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.405521 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.405679 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.405765 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.405685 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gfgpk"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406052 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406143 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406436 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406517 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406553 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406647 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406673 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406774 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406822 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406894 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.406935 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407029 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407149 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407250 4822 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407358 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407394 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407403 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407493 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407530 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407674 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407934 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.407368 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408174 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408242 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 
06:26:33.408297 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408351 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408420 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408469 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408548 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408631 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tr8l7"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.408782 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.409013 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.409123 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.409354 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.409387 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.422499 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.443277 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.443545 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.443605 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.444136 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.444310 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.444353 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.445096 4822 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.445471 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446216 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.445494 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446707 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446872 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.447394 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.445552 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446608 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446068 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.447618 4822 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446423 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446497 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446554 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.447943 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.446629 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.447770 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.448098 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.450205 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.450363 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.450612 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.450671 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.452135 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.452257 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.453214 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.455597 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.458553 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.458836 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459020 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459128 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459183 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 10 06:26:33 crc 
kubenswrapper[4822]: I1010 06:26:33.459367 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459439 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459400 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459400 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.459082 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.460037 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.461612 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.461859 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462006 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462256 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462319 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462600 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462671 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.462975 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.463035 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dtp5"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.463234 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.463891 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.469886 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.470577 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.470962 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.471657 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.472396 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h59tt"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.472559 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.476166 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.476558 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.476851 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.477046 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.477644 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.477956 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.479614 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dpxkz"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.480175 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.481853 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.481863 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.482261 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.482447 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.518906 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.519776 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.521816 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528825 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62mh\" (UniqueName: \"kubernetes.io/projected/9d368e81-49c5-4a8c-8903-d393afe2e509-kube-api-access-t62mh\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528883 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a67f4-5fa9-400f-9877-784faffe19fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: \"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528920 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-config\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpwq\" (UniqueName: \"kubernetes.io/projected/96d524e6-85da-48fc-a1b7-a56c007380e4-kube-api-access-7vpwq\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528976 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.528996 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " 
pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.522704 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.522870 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.522919 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.525825 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526048 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526126 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526189 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526331 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526533 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.526643 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: 
I1010 06:26:33.543354 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544050 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544115 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544142 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9d368e81-49c5-4a8c-8903-d393afe2e509-machine-approver-tls\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544162 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544183 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5sjz\" (UniqueName: \"kubernetes.io/projected/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-kube-api-access-b5sjz\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544217 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544237 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544257 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d524e6-85da-48fc-a1b7-a56c007380e4-serving-cert\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544275 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28dw\" (UniqueName: \"kubernetes.io/projected/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-kube-api-access-s28dw\") pod 
\"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544294 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544310 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544384 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-policies\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544406 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-client\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544425 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tkw\" (UniqueName: \"kubernetes.io/projected/6e88bcbb-f0a7-4837-96e0-6ced47adb39a-kube-api-access-72tkw\") pod \"downloads-7954f5f757-tr8l7\" (UID: \"6e88bcbb-f0a7-4837-96e0-6ced47adb39a\") " pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544453 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-encryption-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544472 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-trusted-ca\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544492 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-config\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544519 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-metrics-tls\") pod \"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544553 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544573 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-images\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544599 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rclk4\" (UniqueName: \"kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544624 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-node-pullsecrets\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544646 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544673 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-auth-proxy-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544711 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-encryption-config\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544737 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544758 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-serving-cert\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 
06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544779 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-dir\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544819 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544853 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-client\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544872 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmx68\" (UniqueName: \"kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544890 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbrp\" (UniqueName: \"kubernetes.io/projected/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-kube-api-access-fzbrp\") pod 
\"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544910 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9rs\" (UniqueName: \"kubernetes.io/projected/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-kube-api-access-8g9rs\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544945 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdsv\" (UniqueName: \"kubernetes.io/projected/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-kube-api-access-5tdsv\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544968 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.544990 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit-dir\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545011 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545032 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545066 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmrj\" (UniqueName: \"kubernetes.io/projected/828ae265-a267-4af7-9893-671d038878b7-kube-api-access-qbmrj\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545087 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545108 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545134 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545151 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545173 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8jn\" (UniqueName: \"kubernetes.io/projected/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-kube-api-access-fr8jn\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545195 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-config\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545227 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545250 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-image-import-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545270 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-service-ca-bundle\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545292 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-serving-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545312 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-serving-cert\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc 
kubenswrapper[4822]: I1010 06:26:33.545330 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545353 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfp7s\" (UniqueName: \"kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545373 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9t9d\" (UniqueName: \"kubernetes.io/projected/cb0a67f4-5fa9-400f-9877-784faffe19fd-kube-api-access-v9t9d\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: \"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545392 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545412 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828ae265-a267-4af7-9893-671d038878b7-serving-cert\") pod 
\"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545439 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.545459 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-serving-cert\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.548958 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.549018 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.549296 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.550019 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.550064 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.551140 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.551852 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.554471 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.555136 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.555455 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdbjq"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.556078 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.557306 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.557730 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.558303 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.566669 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.567020 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.568821 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.569138 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.569347 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.569577 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.570177 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.570501 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.571104 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.571682 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.572182 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.572185 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.573878 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s5mxk"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.574363 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.574456 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.575464 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nd8fx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.575486 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.576661 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.576830 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.577153 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.582387 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.582898 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.583248 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.583320 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.584537 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.584593 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gh994"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.585392 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.585864 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.586116 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.587000 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.587059 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fwsh4"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.588421 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.589675 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hq4fv"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.590357 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.591654 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.592137 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.604228 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.607860 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.609267 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.610027 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tr8l7"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.612304 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.612787 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.613602 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.616412 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.624129 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.626880 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.628750 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.630400 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kznjd"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.632143 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"] Oct 10 06:26:33 crc 
kubenswrapper[4822]: I1010 06:26:33.632824 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.634419 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.636544 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.637759 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gfgpk"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.639135 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.640223 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.644330 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.645914 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dtp5"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646275 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit-dir\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 
06:26:33.646333 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646368 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646416 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646441 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646464 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit-dir\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc 
kubenswrapper[4822]: I1010 06:26:33.646461 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-etcd-client\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646626 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646684 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b18ce54d-fc7a-46d0-a829-cb94946df57a-tmpfs\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646709 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld55k\" (UniqueName: \"kubernetes.io/projected/206453dd-5793-4461-be49-6d3de82b1431-kube-api-access-ld55k\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646748 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config\") pod 
\"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646771 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2k8w\" (UniqueName: \"kubernetes.io/projected/dc8a3486-2bb5-49fd-99ed-09a9e743932c-kube-api-access-v2k8w\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646848 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7px2t\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-kube-api-access-7px2t\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646869 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7v5b\" (UniqueName: \"kubernetes.io/projected/3d6d81c3-cd2f-4848-b9c5-2571af33e758-kube-api-access-w7v5b\") pod \"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646895 
4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.646976 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647008 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647089 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647187 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce67030c-749e-44e9-b900-843b66b80829-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647222 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxjm\" (UniqueName: \"kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647269 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647290 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828ae265-a267-4af7-9893-671d038878b7-serving-cert\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647308 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-v9t9d\" (UniqueName: \"kubernetes.io/projected/cb0a67f4-5fa9-400f-9877-784faffe19fd-kube-api-access-v9t9d\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: \"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647331 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647352 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-service-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647375 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647395 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647510 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647573 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-serving-cert\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647637 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647760 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62mh\" (UniqueName: \"kubernetes.io/projected/9d368e81-49c5-4a8c-8903-d393afe2e509-kube-api-access-t62mh\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647849 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-config\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647888 
4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.647939 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648340 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-audit\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648439 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648555 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648723 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648732 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648755 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648766 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdbjq"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-config\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648839 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648914 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648951 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.648983 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-config\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649110 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-metrics-certs\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649182 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx7cj\" (UniqueName: \"kubernetes.io/projected/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-kube-api-access-vx7cj\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649238 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5sjz\" (UniqueName: \"kubernetes.io/projected/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-kube-api-access-b5sjz\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649290 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df9d10fb-c826-49b2-bc26-487b5a02822c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649319 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s28dw\" (UniqueName: \"kubernetes.io/projected/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-kube-api-access-s28dw\") pod \"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649345 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649367 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649432 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649456 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmqv\" (UniqueName: \"kubernetes.io/projected/f25f417d-9d4b-426b-949e-b724290eb645-kube-api-access-5qmqv\") pod 
\"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649529 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmw5h\" (UniqueName: \"kubernetes.io/projected/23469a60-87a5-4400-a80b-2a3283833474-kube-api-access-qmw5h\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649646 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-policies\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649682 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-client\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649709 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tkw\" 
(UniqueName: \"kubernetes.io/projected/6e88bcbb-f0a7-4837-96e0-6ced47adb39a-kube-api-access-72tkw\") pod \"downloads-7954f5f757-tr8l7\" (UID: \"6e88bcbb-f0a7-4837-96e0-6ced47adb39a\") " pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-encryption-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649863 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-config\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649888 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649915 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.649938 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.650012 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-stats-auth\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.650040 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.650066 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80899bb7-40c7-4bb1-8a61-d620fffdc036-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.650093 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6d81c3-cd2f-4848-b9c5-2571af33e758-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.650403 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651143 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651392 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-policies\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651397 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-node-pullsecrets\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651472 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rclk4\" (UniqueName: \"kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4\") pod 
\"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651477 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-node-pullsecrets\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651528 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ttl\" (UniqueName: \"kubernetes.io/projected/714a8c0a-65f3-46b2-b40b-f4faa4aeb505-kube-api-access-n7ttl\") pod \"migrator-59844c95c7-bprpb\" (UID: \"714a8c0a-65f3-46b2-b40b-f4faa4aeb505\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651689 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651855 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-config\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.651950 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652005 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-config\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652040 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652099 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-encryption-config\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652161 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " 
pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652215 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652327 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652358 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-client\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652399 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9rs\" (UniqueName: \"kubernetes.io/projected/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-kube-api-access-8g9rs\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652419 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvh7d\" (UniqueName: 
\"kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652440 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdsv\" (UniqueName: \"kubernetes.io/projected/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-kube-api-access-5tdsv\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652461 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.652845 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.653546 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmrj\" (UniqueName: \"kubernetes.io/projected/828ae265-a267-4af7-9893-671d038878b7-kube-api-access-qbmrj\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.653631 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.653904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.653940 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8jn\" (UniqueName: \"kubernetes.io/projected/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-kube-api-access-fr8jn\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.653970 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-config\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654021 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62jm\" (UniqueName: 
\"kubernetes.io/projected/e3700224-4110-4020-a014-a904f6710ce2-kube-api-access-p62jm\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654048 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d10fb-c826-49b2-bc26-487b5a02822c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654072 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzqrz\" (UniqueName: \"kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654098 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-image-import-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654135 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-service-ca-bundle\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 
10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654153 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-serving-cert\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654172 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654190 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfp7s\" (UniqueName: \"kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654390 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654430 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.654966 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-serving-cert\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.655178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-image-import-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.655283 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-config\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.655588 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.655698 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656152 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s5mxk"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656185 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656269 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-client\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-serving-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656630 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-etcd-client\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656709 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir\") pod 
\"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656785 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a3486-2bb5-49fd-99ed-09a9e743932c-service-ca-bundle\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656834 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.656898 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a67f4-5fa9-400f-9877-784faffe19fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: \"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657130 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-etcd-serving-ca\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657232 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpwq\" (UniqueName: \"kubernetes.io/projected/96d524e6-85da-48fc-a1b7-a56c007380e4-kube-api-access-7vpwq\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " 
pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657310 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-serving-cert\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657325 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657418 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-encryption-config\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657435 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657471 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657506 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-proxy-tls\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657554 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657583 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657648 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657666 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657677 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9d368e81-49c5-4a8c-8903-d393afe2e509-machine-approver-tls\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657705 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5762cd1d-9023-42d1-908e-a2da9cf7f052-trusted-ca\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657731 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657774 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657849 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657886 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5762cd1d-9023-42d1-908e-a2da9cf7f052-metrics-tls\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657920 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln56l\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-kube-api-access-ln56l\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657949 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d524e6-85da-48fc-a1b7-a56c007380e4-serving-cert\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.657992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: 
I1010 06:26:33.658011 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658034 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658118 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658584 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658686 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" 
Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658774 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658886 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-trusted-ca\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658921 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.658982 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-metrics-tls\") pod 
\"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-images\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659038 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d10fb-c826-49b2-bc26-487b5a02822c-config\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659066 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-default-certificate\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659090 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659114 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce67030c-749e-44e9-b900-843b66b80829-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659139 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659170 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-auth-proxy-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8vv\" (UniqueName: \"kubernetes.io/projected/80899bb7-40c7-4bb1-8a61-d620fffdc036-kube-api-access-gd8vv\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659259 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80899bb7-40c7-4bb1-8a61-d620fffdc036-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659285 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-serving-cert\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659311 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-dir\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858l2\" (UniqueName: \"kubernetes.io/projected/b18ce54d-fc7a-46d0-a829-cb94946df57a-kube-api-access-858l2\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659369 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659412 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bmx68\" (UniqueName: \"kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659439 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbrp\" (UniqueName: \"kubernetes.io/projected/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-kube-api-access-fzbrp\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659467 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwms\" (UniqueName: \"kubernetes.io/projected/aed35222-5301-4df9-8f23-16816ebe4871-kube-api-access-8pwms\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659689 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659725 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-serving-cert\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 
06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659926 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.660015 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96d524e6-85da-48fc-a1b7-a56c007380e4-trusted-ca\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.660105 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-audit-dir\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.660229 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/828ae265-a267-4af7-9893-671d038878b7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.659441 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.660573 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-images\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661208 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a67f4-5fa9-400f-9877-784faffe19fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: \"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661253 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661268 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661572 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661692 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661772 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9d368e81-49c5-4a8c-8903-d393afe2e509-auth-proxy-config\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.661849 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d524e6-85da-48fc-a1b7-a56c007380e4-serving-cert\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.662772 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-encryption-config\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.662839 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.663367 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/9d368e81-49c5-4a8c-8903-d393afe2e509-machine-approver-tls\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.663426 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-metrics-tls\") pod \"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.664143 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-serving-cert\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.665945 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ftmzs"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.667172 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828ae265-a267-4af7-9893-671d038878b7-serving-cert\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.667288 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nd8fx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.667464 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.667793 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.669402 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.670910 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p2d4j"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.671690 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.671997 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.678568 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.681436 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.692154 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gh994"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.692476 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.692849 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 
06:26:33.694917 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.698600 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ftmzs"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.702306 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2d4j"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.703986 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-r2qgn"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.704878 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.705476 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2qgn"] Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.713661 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.732942 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.752600 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760259 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760302 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760334 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760365 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-config\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760391 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760417 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: 
\"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760442 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760478 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-metrics-certs\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760506 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx7cj\" (UniqueName: \"kubernetes.io/projected/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-kube-api-access-vx7cj\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760553 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df9d10fb-c826-49b2-bc26-487b5a02822c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760602 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760648 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760678 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qmqv\" (UniqueName: \"kubernetes.io/projected/f25f417d-9d4b-426b-949e-b724290eb645-kube-api-access-5qmqv\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760719 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmw5h\" (UniqueName: \"kubernetes.io/projected/23469a60-87a5-4400-a80b-2a3283833474-kube-api-access-qmw5h\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760757 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 
06:26:33.760783 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760835 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760863 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760895 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80899bb7-40c7-4bb1-8a61-d620fffdc036-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760922 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6d81c3-cd2f-4848-b9c5-2571af33e758-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760951 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-stats-auth\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.760986 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ttl\" (UniqueName: \"kubernetes.io/projected/714a8c0a-65f3-46b2-b40b-f4faa4aeb505-kube-api-access-n7ttl\") pod \"migrator-59844c95c7-bprpb\" (UID: \"714a8c0a-65f3-46b2-b40b-f4faa4aeb505\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761014 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761042 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-config\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761080 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761107 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761147 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvh7d\" (UniqueName: \"kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761185 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761235 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62jm\" (UniqueName: \"kubernetes.io/projected/e3700224-4110-4020-a014-a904f6710ce2-kube-api-access-p62jm\") pod 
\"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761261 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d10fb-c826-49b2-bc26-487b5a02822c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzqrz\" (UniqueName: \"kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761339 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761368 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a3486-2bb5-49fd-99ed-09a9e743932c-service-ca-bundle\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761406 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-proxy-tls\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761450 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-serving-cert\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761474 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761500 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761545 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5762cd1d-9023-42d1-908e-a2da9cf7f052-trusted-ca\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761586 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761631 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761676 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5762cd1d-9023-42d1-908e-a2da9cf7f052-metrics-tls\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761686 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80899bb7-40c7-4bb1-8a61-d620fffdc036-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln56l\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-kube-api-access-ln56l\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: 
\"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.761735 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.762497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763136 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763325 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763817 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763883 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763937 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d10fb-c826-49b2-bc26-487b5a02822c-config\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.763975 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764007 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce67030c-749e-44e9-b900-843b66b80829-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 
06:26:33.764040 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764070 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764079 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-default-certificate\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764140 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80899bb7-40c7-4bb1-8a61-d620fffdc036-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764145 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: 
\"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764173 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8vv\" (UniqueName: \"kubernetes.io/projected/80899bb7-40c7-4bb1-8a61-d620fffdc036-kube-api-access-gd8vv\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764151 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764231 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwms\" (UniqueName: \"kubernetes.io/projected/aed35222-5301-4df9-8f23-16816ebe4871-kube-api-access-8pwms\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764304 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-858l2\" (UniqueName: \"kubernetes.io/projected/b18ce54d-fc7a-46d0-a829-cb94946df57a-kube-api-access-858l2\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764333 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764385 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764418 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-etcd-client\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764452 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2k8w\" (UniqueName: \"kubernetes.io/projected/dc8a3486-2bb5-49fd-99ed-09a9e743932c-kube-api-access-v2k8w\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764482 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764511 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b18ce54d-fc7a-46d0-a829-cb94946df57a-tmpfs\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764534 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld55k\" (UniqueName: \"kubernetes.io/projected/206453dd-5793-4461-be49-6d3de82b1431-kube-api-access-ld55k\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764558 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7px2t\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-kube-api-access-7px2t\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764581 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7v5b\" (UniqueName: \"kubernetes.io/projected/3d6d81c3-cd2f-4848-b9c5-2571af33e758-kube-api-access-w7v5b\") pod \"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764594 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5762cd1d-9023-42d1-908e-a2da9cf7f052-trusted-ca\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764605 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.764992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765029 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce67030c-749e-44e9-b900-843b66b80829-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 
10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765090 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxjm\" (UniqueName: \"kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765134 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765166 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-service-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765194 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765328 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765494 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce67030c-749e-44e9-b900-843b66b80829-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.765672 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b18ce54d-fc7a-46d0-a829-cb94946df57a-tmpfs\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.766479 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5762cd1d-9023-42d1-908e-a2da9cf7f052-metrics-tls\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.766537 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-metrics-certs\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.766660 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.767041 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.767463 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.767845 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.768988 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-default-certificate\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: 
I1010 06:26:33.769216 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.769462 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80899bb7-40c7-4bb1-8a61-d620fffdc036-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.769591 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.769766 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.770169 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ce67030c-749e-44e9-b900-843b66b80829-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.770461 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.773083 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.785802 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc8a3486-2bb5-49fd-99ed-09a9e743932c-stats-auth\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.794016 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.812921 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.814174 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a3486-2bb5-49fd-99ed-09a9e743932c-service-ca-bundle\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " 
pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.833690 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.873227 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.893843 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.912590 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.933037 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.954289 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.973832 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 10 06:26:33 crc kubenswrapper[4822]: I1010 06:26:33.993426 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.013124 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.033914 4822 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.044930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.052998 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.072719 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.081982 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-config\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.094138 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.096536 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-service-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.113459 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.134210 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.152310 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.157076 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-serving-cert\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.173346 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.177206 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aed35222-5301-4df9-8f23-16816ebe4871-etcd-client\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.193032 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.201434 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-config\") pod 
\"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.213318 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.222173 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aed35222-5301-4df9-8f23-16816ebe4871-etcd-ca\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.232895 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.253215 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.256325 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-proxy-tls\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.272314 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.292024 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.296626 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d10fb-c826-49b2-bc26-487b5a02822c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.313178 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.332411 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.352343 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.354870 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d10fb-c826-49b2-bc26-487b5a02822c-config\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.372542 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.392119 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.413833 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 10 06:26:34 crc 
kubenswrapper[4822]: I1010 06:26:34.432608 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.453026 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.473863 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.492684 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.514095 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.524504 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d6d81c3-cd2f-4848-b9c5-2571af33e758-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.533752 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.552381 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.558686 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.573169 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.591179 4822 request.go:700] Waited for 1.018755445s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.593354 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.620695 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.625783 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.633929 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.653346 4822 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.672500 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.693363 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.712300 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.732623 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.751995 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761238 4822 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761306 4822 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761243 4822 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761341 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key podName:23469a60-87a5-4400-a80b-2a3283833474 nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:35.261318949 +0000 UTC m=+142.356477155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key") pod "service-ca-9c57cc56f-nd8fx" (UID: "23469a60-87a5-4400-a80b-2a3283833474") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761257 4822 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761365 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images podName:f25f417d-9d4b-426b-949e-b724290eb645 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.2613563 +0000 UTC m=+142.356514516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images") pod "machine-config-operator-74547568cd-gh994" (UID: "f25f417d-9d4b-426b-949e-b724290eb645") : failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761387 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls podName:f25f417d-9d4b-426b-949e-b724290eb645 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.261377381 +0000 UTC m=+142.356535587 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls") pod "machine-config-operator-74547568cd-gh994" (UID: "f25f417d-9d4b-426b-949e-b724290eb645") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761405 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle podName:23469a60-87a5-4400-a80b-2a3283833474 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.261396632 +0000 UTC m=+142.356554838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle") pod "service-ca-9c57cc56f-nd8fx" (UID: "23469a60-87a5-4400-a80b-2a3283833474") : failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761539 4822 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.761582 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert podName:e3700224-4110-4020-a014-a904f6710ce2 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.261564326 +0000 UTC m=+142.356722522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert") pod "service-ca-operator-777779d784-p5bvv" (UID: "e3700224-4110-4020-a014-a904f6710ce2") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.763659 4822 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.763696 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert podName:206453dd-5793-4461-be49-6d3de82b1431 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.263685878 +0000 UTC m=+142.358844074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert") pod "catalog-operator-68c6474976-9rx2j" (UID: "206453dd-5793-4461-be49-6d3de82b1431") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.763711 4822 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.763751 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume podName:2df5f09d-0d1c-40cf-9041-695d831d552d nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.26374219 +0000 UTC m=+142.358900376 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume") pod "collect-profiles-29334615-ssmbx" (UID: "2df5f09d-0d1c-40cf-9041-695d831d552d") : failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.764910 4822 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.764926 4822 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.764935 4822 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.764961 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config podName:e3700224-4110-4020-a014-a904f6710ce2 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.264948625 +0000 UTC m=+142.360106911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config") pod "service-ca-operator-777779d784-p5bvv" (UID: "e3700224-4110-4020-a014-a904f6710ce2") : failed to sync configmap cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.764985 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume podName:2df5f09d-0d1c-40cf-9041-695d831d552d nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:35.264971296 +0000 UTC m=+142.360129502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume") pod "collect-profiles-29334615-ssmbx" (UID: "2df5f09d-0d1c-40cf-9041-695d831d552d") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.765001 4822 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.765043 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert podName:206453dd-5793-4461-be49-6d3de82b1431 nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.265014627 +0000 UTC m=+142.360172873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert") pod "catalog-operator-68c6474976-9rx2j" (UID: "206453dd-5793-4461-be49-6d3de82b1431") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.765069 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert podName:b18ce54d-fc7a-46d0-a829-cb94946df57a nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.265058048 +0000 UTC m=+142.360216474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert") pod "packageserver-d55dfcdfc-pvwp8" (UID: "b18ce54d-fc7a-46d0-a829-cb94946df57a") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.765266 4822 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: E1010 06:26:34.765315 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert podName:b18ce54d-fc7a-46d0-a829-cb94946df57a nodeName:}" failed. No retries permitted until 2025-10-10 06:26:35.265303885 +0000 UTC m=+142.360462091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert") pod "packageserver-d55dfcdfc-pvwp8" (UID: "b18ce54d-fc7a-46d0-a829-cb94946df57a") : failed to sync secret cache: timed out waiting for the condition Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.772014 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.791483 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.811865 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.832267 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.852570 4822 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.872689 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.892484 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.912961 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.932769 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.953279 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.973281 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 10 06:26:34 crc kubenswrapper[4822]: I1010 06:26:34.992616 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.012911 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.033915 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.052828 4822 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.072607 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.092450 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.112734 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.132984 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.152826 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.173263 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.192652 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.213203 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.272367 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9t9d\" (UniqueName: \"kubernetes.io/projected/cb0a67f4-5fa9-400f-9877-784faffe19fd-kube-api-access-v9t9d\") pod \"cluster-samples-operator-665b6dd947-nhr9f\" (UID: 
\"cb0a67f4-5fa9-400f-9877-784faffe19fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.288218 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62mh\" (UniqueName: \"kubernetes.io/projected/9d368e81-49c5-4a8c-8903-d393afe2e509-kube-api-access-t62mh\") pod \"machine-approver-56656f9798-mfzbs\" (UID: \"9d368e81-49c5-4a8c-8903-d393afe2e509\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289372 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289409 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289478 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289502 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289610 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289638 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289663 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289682 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 
06:26:35.289713 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289767 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289827 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.289850 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.290301 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23469a60-87a5-4400-a80b-2a3283833474-signing-cabundle\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.290821 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f25f417d-9d4b-426b-949e-b724290eb645-images\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.291624 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3700224-4110-4020-a014-a904f6710ce2-config\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.292252 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.293605 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3700224-4110-4020-a014-a904f6710ce2-serving-cert\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.293985 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-webhook-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: 
\"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.294265 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-profile-collector-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.295234 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.295595 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/206453dd-5793-4461-be49-6d3de82b1431-srv-cert\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.296479 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b18ce54d-fc7a-46d0-a829-cb94946df57a-apiservice-cert\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.297681 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f25f417d-9d4b-426b-949e-b724290eb645-proxy-tls\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.298457 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23469a60-87a5-4400-a80b-2a3283833474-signing-key\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.305947 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tkw\" (UniqueName: \"kubernetes.io/projected/6e88bcbb-f0a7-4837-96e0-6ced47adb39a-kube-api-access-72tkw\") pod \"downloads-7954f5f757-tr8l7\" (UID: \"6e88bcbb-f0a7-4837-96e0-6ced47adb39a\") " pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.330670 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5sjz\" (UniqueName: \"kubernetes.io/projected/741dd6ab-d5b3-421f-94d3-de6bd19c2f86-kube-api-access-b5sjz\") pod \"openshift-apiserver-operator-796bbdcf4f-sg7rz\" (UID: \"741dd6ab-d5b3-421f-94d3-de6bd19c2f86\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.346965 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28dw\" (UniqueName: \"kubernetes.io/projected/0319e3d8-e3a7-499e-962a-efeb8bc2e3a3-kube-api-access-s28dw\") pod \"dns-operator-744455d44c-9dtp5\" (UID: \"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3\") " pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.368520 4822 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.373376 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rclk4\" (UniqueName: \"kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4\") pod \"route-controller-manager-6576b87f9c-wfs5s\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.383175 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.387633 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9rs\" (UniqueName: \"kubernetes.io/projected/28d3fed7-bb74-4e55-a788-e354b2f0cd5c-kube-api-access-8g9rs\") pod \"openshift-config-operator-7777fb866f-n9rf4\" (UID: \"28d3fed7-bb74-4e55-a788-e354b2f0cd5c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.390317 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.408765 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdsv\" (UniqueName: \"kubernetes.io/projected/2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d-kube-api-access-5tdsv\") pod \"apiserver-76f77b778f-kznjd\" (UID: \"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d\") " pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.413793 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.429676 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmrj\" (UniqueName: \"kubernetes.io/projected/828ae265-a267-4af7-9893-671d038878b7-kube-api-access-qbmrj\") pod \"authentication-operator-69f744f599-fwsh4\" (UID: \"828ae265-a267-4af7-9893-671d038878b7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.433689 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.440154 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.448384 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8jn\" (UniqueName: \"kubernetes.io/projected/f6bbb19f-3429-4af5-a28a-a0d0815f8ff3-kube-api-access-fr8jn\") pod \"apiserver-7bbb656c7d-p8fht\" (UID: \"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.472129 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfp7s\" (UniqueName: \"kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s\") pod \"console-f9d7485db-kvjlx\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.490240 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpwq\" (UniqueName: 
\"kubernetes.io/projected/96d524e6-85da-48fc-a1b7-a56c007380e4-kube-api-access-7vpwq\") pod \"console-operator-58897d9998-gfgpk\" (UID: \"96d524e6-85da-48fc-a1b7-a56c007380e4\") " pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.514459 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmx68\" (UniqueName: \"kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68\") pod \"controller-manager-879f6c89f-z9kf5\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.527979 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbrp\" (UniqueName: \"kubernetes.io/projected/27d8f9ac-f418-46b8-9f7a-8bfc8dde1755-kube-api-access-fzbrp\") pod \"machine-api-operator-5694c8668f-h59tt\" (UID: \"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.533974 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.548813 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.556765 4822 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.563894 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.572992 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.582894 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.594008 4822 request.go:700] Waited for 1.922027049s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.596882 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.612007 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.620508 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4"] Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.623439 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.634469 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 10 06:26:35 crc kubenswrapper[4822]: W1010 06:26:35.645610 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d3fed7_bb74_4e55_a788_e354b2f0cd5c.slice/crio-bb75be37fa58f7db9f1ba1aa4e35c29bfa3f8d6fa32e547169b60f692a869498 WatchSource:0}: Error finding container bb75be37fa58f7db9f1ba1aa4e35c29bfa3f8d6fa32e547169b60f692a869498: Status 404 returned error can't find the container with id bb75be37fa58f7db9f1ba1aa4e35c29bfa3f8d6fa32e547169b60f692a869498 Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.649269 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.658464 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.659673 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.673434 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.696479 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.719040 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.721004 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.758932 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/248e0ae9-9e34-4c1e-bfbe-e60cbebc2444-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2fchz\" (UID: \"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.787920 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df9d10fb-c826-49b2-bc26-487b5a02822c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5554q\" (UID: \"df9d10fb-c826-49b2-bc26-487b5a02822c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.794361 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx7cj\" (UniqueName: \"kubernetes.io/projected/afbff33f-3af0-4edf-b6d7-60a42cc2bc0b-kube-api-access-vx7cj\") pod \"machine-config-controller-84d6567774-qgtpl\" (UID: \"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.815114 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmqv\" (UniqueName: 
\"kubernetes.io/projected/f25f417d-9d4b-426b-949e-b724290eb645-kube-api-access-5qmqv\") pod \"machine-config-operator-74547568cd-gh994\" (UID: \"f25f417d-9d4b-426b-949e-b724290eb645\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.818320 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.823787 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.831846 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.842274 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.844922 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.847369 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmw5h\" (UniqueName: \"kubernetes.io/projected/23469a60-87a5-4400-a80b-2a3283833474-kube-api-access-qmw5h\") pod \"service-ca-9c57cc56f-nd8fx\" (UID: \"23469a60-87a5-4400-a80b-2a3283833474\") " pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.867731 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ttl\" (UniqueName: \"kubernetes.io/projected/714a8c0a-65f3-46b2-b40b-f4faa4aeb505-kube-api-access-n7ttl\") pod \"migrator-59844c95c7-bprpb\" (UID: \"714a8c0a-65f3-46b2-b40b-f4faa4aeb505\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.875658 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tr8l7"] Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.904691 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62jm\" (UniqueName: \"kubernetes.io/projected/e3700224-4110-4020-a014-a904f6710ce2-kube-api-access-p62jm\") pod \"service-ca-operator-777779d784-p5bvv\" (UID: \"e3700224-4110-4020-a014-a904f6710ce2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.914766 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.914915 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"] Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.928012 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f"] Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.928048 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.941748 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.948024 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzqrz\" (UniqueName: \"kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz\") pod \"marketplace-operator-79b997595-cvtdr\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") " pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.948210 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvh7d\" (UniqueName: \"kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d\") pod \"collect-profiles-29334615-ssmbx\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.954642 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.960716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln56l\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-kube-api-access-ln56l\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.961103 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.969575 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5762cd1d-9023-42d1-908e-a2da9cf7f052-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hbrhm\" (UID: \"5762cd1d-9023-42d1-908e-a2da9cf7f052\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:35 crc kubenswrapper[4822]: I1010 06:26:35.992541 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwms\" (UniqueName: \"kubernetes.io/projected/aed35222-5301-4df9-8f23-16816ebe4871-kube-api-access-8pwms\") pod \"etcd-operator-b45778765-hdbjq\" (UID: \"aed35222-5301-4df9-8f23-16816ebe4871\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.006453 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858l2\" (UniqueName: \"kubernetes.io/projected/b18ce54d-fc7a-46d0-a829-cb94946df57a-kube-api-access-858l2\") pod \"packageserver-d55dfcdfc-pvwp8\" (UID: \"b18ce54d-fc7a-46d0-a829-cb94946df57a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 
06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.017692 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9dtp5"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.034375 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2k8w\" (UniqueName: \"kubernetes.io/projected/dc8a3486-2bb5-49fd-99ed-09a9e743932c-kube-api-access-v2k8w\") pod \"router-default-5444994796-dpxkz\" (UID: \"dc8a3486-2bb5-49fd-99ed-09a9e743932c\") " pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.053441 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7v5b\" (UniqueName: \"kubernetes.io/projected/3d6d81c3-cd2f-4848-b9c5-2571af33e758-kube-api-access-w7v5b\") pod \"package-server-manager-789f6589d5-7trts\" (UID: \"3d6d81c3-cd2f-4848-b9c5-2571af33e758\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.076372 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld55k\" (UniqueName: \"kubernetes.io/projected/206453dd-5793-4461-be49-6d3de82b1431-kube-api-access-ld55k\") pod \"catalog-operator-68c6474976-9rx2j\" (UID: \"206453dd-5793-4461-be49-6d3de82b1431\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.095324 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.095957 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.102148 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7px2t\" (UniqueName: \"kubernetes.io/projected/ce67030c-749e-44e9-b900-843b66b80829-kube-api-access-7px2t\") pod \"cluster-image-registry-operator-dc59b4c8b-7c67x\" (UID: \"ce67030c-749e-44e9-b900-843b66b80829\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.118377 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.119182 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kznjd"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.125015 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.126774 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxjm\" (UniqueName: \"kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm\") pod \"oauth-openshift-558db77b4-pv22z\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.138417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8vv\" (UniqueName: \"kubernetes.io/projected/80899bb7-40c7-4bb1-8a61-d620fffdc036-kube-api-access-gd8vv\") pod \"openshift-controller-manager-operator-756b6f6bc6-mx968\" (UID: \"80899bb7-40c7-4bb1-8a61-d620fffdc036\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:36 crc kubenswrapper[4822]: W1010 06:26:36.142679 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0319e3d8_e3a7_499e_962a_efeb8bc2e3a3.slice/crio-7fc9c8d269015aaef2a76b70fc070fc4d402766c254c385719f39b44e1df443c WatchSource:0}: Error finding container 7fc9c8d269015aaef2a76b70fc070fc4d402766c254c385719f39b44e1df443c: Status 404 returned error can't find the container with id 7fc9c8d269015aaef2a76b70fc070fc4d402766c254c385719f39b44e1df443c Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.156110 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.163255 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.197241 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.213950 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-node-bootstrap-token\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.213990 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da608e14-b5f2-4943-b344-ed8a280963b8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214034 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214059 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxsq\" (UniqueName: \"kubernetes.io/projected/3d322408-d6af-47c5-afe2-995737d9d6e2-kube-api-access-dzxsq\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214094 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214115 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214132 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbrj\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214189 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da608e14-b5f2-4943-b344-ed8a280963b8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: 
I1010 06:26:36.214223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-certs\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214247 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39066f14-1817-4867-af96-ee099b009933-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxd2\" (UniqueName: \"kubernetes.io/projected/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-kube-api-access-ksxd2\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214314 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-srv-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214372 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39066f14-1817-4867-af96-ee099b009933-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214440 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214465 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwqmc\" (UniqueName: \"kubernetes.io/projected/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-kube-api-access-lwqmc\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214486 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214504 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214521 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d322408-d6af-47c5-afe2-995737d9d6e2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214566 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhqd\" (UniqueName: \"kubernetes.io/projected/a598b62a-71ad-438f-b83d-173c488d6e0a-kube-api-access-bkhqd\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214595 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214611 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da608e14-b5f2-4943-b344-ed8a280963b8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.214647 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wl2\" (UniqueName: \"kubernetes.io/projected/39066f14-1817-4867-af96-ee099b009933-kube-api-access-h8wl2\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.216401 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:36.716380381 +0000 UTC m=+143.811538577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.230143 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.292443 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.325328 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.325646 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.327442 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.328121 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da608e14-b5f2-4943-b344-ed8a280963b8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.328172 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-mountpoint-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.328218 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-certs\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.328412 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39066f14-1817-4867-af96-ee099b009933-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.328503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxd2\" (UniqueName: \"kubernetes.io/projected/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-kube-api-access-ksxd2\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.329734 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.330638 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da608e14-b5f2-4943-b344-ed8a280963b8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.337726 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.338359 
4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:36.838325132 +0000 UTC m=+143.933483328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.338667 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-srv-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.343349 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.343972 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39066f14-1817-4867-af96-ee099b009933-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344024 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-socket-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344172 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344205 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwqmc\" (UniqueName: \"kubernetes.io/projected/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-kube-api-access-lwqmc\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344224 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344248 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17e41941-eca6-4d67-8d1f-097ae487537b-cert\") pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: 
I1010 06:26:36.344320 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d322408-d6af-47c5-afe2-995737d9d6e2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344342 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344410 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-plugins-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344473 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhqd\" (UniqueName: \"kubernetes.io/projected/a598b62a-71ad-438f-b83d-173c488d6e0a-kube-api-access-bkhqd\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344527 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pct\" (UniqueName: \"kubernetes.io/projected/17e41941-eca6-4d67-8d1f-097ae487537b-kube-api-access-72pct\") 
pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344644 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344668 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da608e14-b5f2-4943-b344-ed8a280963b8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344700 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ea45453-41a2-41f9-b512-8184264743de-metrics-tls\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344721 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.344737 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-certs\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.349429 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-srv-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.349791 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.349898 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wl2\" (UniqueName: \"kubernetes.io/projected/39066f14-1817-4867-af96-ee099b009933-kube-api-access-h8wl2\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.350134 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-registration-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.350868 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-node-bootstrap-token\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.350917 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da608e14-b5f2-4943-b344-ed8a280963b8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.351839 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.353460 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.355267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39066f14-1817-4867-af96-ee099b009933-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 
06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357075 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ea45453-41a2-41f9-b512-8184264743de-config-volume\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357132 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxsq\" (UniqueName: \"kubernetes.io/projected/3d322408-d6af-47c5-afe2-995737d9d6e2-kube-api-access-dzxsq\") pod \"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357455 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357484 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbrj\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357512 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-csi-data-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: 
\"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357543 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2zl\" (UniqueName: \"kubernetes.io/projected/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-kube-api-access-7m2zl\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357661 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357770 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmnsd\" (UniqueName: \"kubernetes.io/projected/9ea45453-41a2-41f9-b512-8184264743de-kube-api-access-cmnsd\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.357985 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.359307 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted\") 
pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.360451 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:36.860422226 +0000 UTC m=+143.955580422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.362224 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.366166 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d322408-d6af-47c5-afe2-995737d9d6e2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.366669 4822 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da608e14-b5f2-4943-b344-ed8a280963b8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.367669 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39066f14-1817-4867-af96-ee099b009933-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.369779 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a598b62a-71ad-438f-b83d-173c488d6e0a-node-bootstrap-token\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.390410 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.392192 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.396471 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.398052 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" event={"ID":"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3","Type":"ContainerStarted","Data":"7fc9c8d269015aaef2a76b70fc070fc4d402766c254c385719f39b44e1df443c"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.399351 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" event={"ID":"05179c47-551e-4445-bf6e-1f328d5f024c","Type":"ContainerStarted","Data":"9d93d429b3e8640fa93ad12b976813361a50485aa0224f36ddcf1642fe2cc85d"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.403204 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxd2\" (UniqueName: \"kubernetes.io/projected/87c8c670-210c-4cb9-8e2b-805a80f2fcbd-kube-api-access-ksxd2\") pod \"multus-admission-controller-857f4d67dd-s5mxk\" (UID: \"87c8c670-210c-4cb9-8e2b-805a80f2fcbd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.404128 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" event={"ID":"9d368e81-49c5-4a8c-8903-d393afe2e509","Type":"ContainerStarted","Data":"c7eefe98d432f8960c6785396a794c7aa91d600b6810dc14a868438179e1a8bb"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.404197 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" 
event={"ID":"9d368e81-49c5-4a8c-8903-d393afe2e509","Type":"ContainerStarted","Data":"bc5b68c3ecc09b63fd5a3f972a2fcaa0483d88db4b694d0e2489d496bb67b986"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.410768 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhqd\" (UniqueName: \"kubernetes.io/projected/a598b62a-71ad-438f-b83d-173c488d6e0a-kube-api-access-bkhqd\") pod \"machine-config-server-hq4fv\" (UID: \"a598b62a-71ad-438f-b83d-173c488d6e0a\") " pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.410910 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h59tt"] Oct 10 06:26:36 crc kubenswrapper[4822]: W1010 06:26:36.415067 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod741dd6ab_d5b3_421f_94d3_de6bd19c2f86.slice/crio-c0e1e010577c9936927ddc9534370e96006077c8f8be05aae477515473ba1328 WatchSource:0}: Error finding container c0e1e010577c9936927ddc9534370e96006077c8f8be05aae477515473ba1328: Status 404 returned error can't find the container with id c0e1e010577c9936927ddc9534370e96006077c8f8be05aae477515473ba1328 Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.416635 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" event={"ID":"28d3fed7-bb74-4e55-a788-e354b2f0cd5c","Type":"ContainerStarted","Data":"37366b8b8024d9348cc4188a125f2d2a1f0d1d3b239116fa633514722b225e29"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.416688 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" event={"ID":"28d3fed7-bb74-4e55-a788-e354b2f0cd5c","Type":"ContainerStarted","Data":"bb75be37fa58f7db9f1ba1aa4e35c29bfa3f8d6fa32e547169b60f692a869498"} Oct 10 06:26:36 
crc kubenswrapper[4822]: I1010 06:26:36.418854 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tr8l7" event={"ID":"6e88bcbb-f0a7-4837-96e0-6ced47adb39a","Type":"ContainerStarted","Data":"8a9363f041f968faeb0057d517ad14a7fb09c177493e4dbeee5e1775262adb3c"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.432141 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wl2\" (UniqueName: \"kubernetes.io/projected/39066f14-1817-4867-af96-ee099b009933-kube-api-access-h8wl2\") pod \"kube-storage-version-migrator-operator-b67b599dd-rjz5p\" (UID: \"39066f14-1817-4867-af96-ee099b009933\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.434383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" event={"ID":"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d","Type":"ContainerStarted","Data":"9b143182a700d0d6c528098beecc27f4e3522ce81c8050cc940d13e0bdc0d574"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.439737 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gfgpk"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.448001 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.448166 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.449826 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" event={"ID":"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3","Type":"ContainerStarted","Data":"423b17e5777e5d00cf7ee505c35b88c1575cbc3973e422c83b134e06da4e1f2e"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.452007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fwsh4"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.452489 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da608e14-b5f2-4943-b344-ed8a280963b8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2pmgw\" (UID: \"da608e14-b5f2-4943-b344-ed8a280963b8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.456498 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" event={"ID":"0c5524a5-19e5-425c-b94d-c6fd6c4fd916","Type":"ContainerStarted","Data":"eea2134a529864bb8dec11c0c771ef6bee89bd3084bfced3ec8b854510a7a735"} Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.458980 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459299 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/17e41941-eca6-4d67-8d1f-097ae487537b-cert\") pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459350 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-plugins-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459379 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72pct\" (UniqueName: \"kubernetes.io/projected/17e41941-eca6-4d67-8d1f-097ae487537b-kube-api-access-72pct\") pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459411 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ea45453-41a2-41f9-b512-8184264743de-metrics-tls\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459465 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-registration-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459514 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/9ea45453-41a2-41f9-b512-8184264743de-config-volume\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459542 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-csi-data-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459563 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2zl\" (UniqueName: \"kubernetes.io/projected/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-kube-api-access-7m2zl\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459605 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmnsd\" (UniqueName: \"kubernetes.io/projected/9ea45453-41a2-41f9-b512-8184264743de-kube-api-access-cmnsd\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459636 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-mountpoint-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.459700 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-socket-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.461399 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-registration-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.461558 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-plugins-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.464588 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-csi-data-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.464656 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-mountpoint-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.464682 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ea45453-41a2-41f9-b512-8184264743de-config-volume\") pod 
\"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.464740 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-socket-dir\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.464772 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:36.964744933 +0000 UTC m=+144.059903169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.469675 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ea45453-41a2-41f9-b512-8184264743de-metrics-tls\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.474291 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.477651 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dzxsq\" (UniqueName: \"kubernetes.io/projected/3d322408-d6af-47c5-afe2-995737d9d6e2-kube-api-access-dzxsq\") pod \"control-plane-machine-set-operator-78cbb6b69f-mfnp2\" (UID: \"3d322408-d6af-47c5-afe2-995737d9d6e2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.483610 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17e41941-eca6-4d67-8d1f-097ae487537b-cert\") pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.495640 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwqmc\" (UniqueName: \"kubernetes.io/projected/12c90526-9d8f-4cf1-9adc-3b51ea32b8b3-kube-api-access-lwqmc\") pod \"olm-operator-6b444d44fb-q5kvv\" (UID: \"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: W1010 06:26:36.504108 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828ae265_a267_4af7_9893_671d038878b7.slice/crio-4d1c7dd8edd24558760c59d898a6c6d5559139439713a91851aa2089686370e5 WatchSource:0}: Error finding container 4d1c7dd8edd24558760c59d898a6c6d5559139439713a91851aa2089686370e5: Status 404 returned error can't find the container with id 4d1c7dd8edd24558760c59d898a6c6d5559139439713a91851aa2089686370e5 Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.506961 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" Oct 10 06:26:36 crc kubenswrapper[4822]: W1010 06:26:36.509842 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0649cc6_9ef6_4ecb_9a0e_fac537a3f208.slice/crio-1a84fff145105e53338569f1d6cc247e233d09f215b39f5a537f99e7b3aa0ef5 WatchSource:0}: Error finding container 1a84fff145105e53338569f1d6cc247e233d09f215b39f5a537f99e7b3aa0ef5: Status 404 returned error can't find the container with id 1a84fff145105e53338569f1d6cc247e233d09f215b39f5a537f99e7b3aa0ef5 Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.515504 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbrj\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.537354 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.546195 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.564823 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.565538 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.065526708 +0000 UTC m=+144.160684904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.573274 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.583192 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hq4fv" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.597581 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pct\" (UniqueName: \"kubernetes.io/projected/17e41941-eca6-4d67-8d1f-097ae487537b-kube-api-access-72pct\") pod \"ingress-canary-p2d4j\" (UID: \"17e41941-eca6-4d67-8d1f-097ae487537b\") " pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.611811 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2zl\" (UniqueName: \"kubernetes.io/projected/f5ada1bf-ca71-45b9-885f-9bd75ba7d400-kube-api-access-7m2zl\") pod \"csi-hostpathplugin-ftmzs\" (UID: \"f5ada1bf-ca71-45b9-885f-9bd75ba7d400\") " pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.620133 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.629859 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2d4j" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.630230 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmnsd\" (UniqueName: \"kubernetes.io/projected/9ea45453-41a2-41f9-b512-8184264743de-kube-api-access-cmnsd\") pod \"dns-default-r2qgn\" (UID: \"9ea45453-41a2-41f9-b512-8184264743de\") " pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.636017 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.678736 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.679693 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.179652402 +0000 UTC m=+144.274810588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.680630 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.681743 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-10 06:26:37.181726622 +0000 UTC m=+144.276884818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.702889 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.758878 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.760489 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.760544 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nd8fx"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.783004 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gh994"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.783267 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: 
E1010 06:26:36.783425 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.283404893 +0000 UTC m=+144.378563089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.783553 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.783881 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.283869866 +0000 UTC m=+144.379028062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.797525 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.804883 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.804951 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.808771 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.884967 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.885820 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:37.385786104 +0000 UTC m=+144.480944300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.886227 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.886879 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.386844035 +0000 UTC m=+144.482002231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.889108 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdbjq"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.897399 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.918545 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j"] Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.987012 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.987263 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.487235288 +0000 UTC m=+144.582393484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:36 crc kubenswrapper[4822]: I1010 06:26:36.987390 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:36 crc kubenswrapper[4822]: E1010 06:26:36.988030 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.488021281 +0000 UTC m=+144.583179477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.085111 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18ce54d_fc7a_46d0_a829_cb94946df57a.slice/crio-3853b6b09c3e9b7782c8eada52dc3521dc444bfbbf8e9358a42eb3bdabfe338e WatchSource:0}: Error finding container 3853b6b09c3e9b7782c8eada52dc3521dc444bfbbf8e9358a42eb3bdabfe338e: Status 404 returned error can't find the container with id 3853b6b09c3e9b7782c8eada52dc3521dc444bfbbf8e9358a42eb3bdabfe338e Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.092075 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.093220 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.593195914 +0000 UTC m=+144.688354110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.095298 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.095868 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.595849691 +0000 UTC m=+144.691007877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.097965 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df5f09d_0d1c_40cf_9041_695d831d552d.slice/crio-225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c WatchSource:0}: Error finding container 225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c: Status 404 returned error can't find the container with id 225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.105171 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3700224_4110_4020_a014_a904f6710ce2.slice/crio-47519dda38e965fd01c2106bf5b56786be0717f95bf324866bb73cdcec938cc9 WatchSource:0}: Error finding container 47519dda38e965fd01c2106bf5b56786be0717f95bf324866bb73cdcec938cc9: Status 404 returned error can't find the container with id 47519dda38e965fd01c2106bf5b56786be0717f95bf324866bb73cdcec938cc9 Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.120830 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x"] Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.140018 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206453dd_5793_4461_be49_6d3de82b1431.slice/crio-ba21b93112464b645ead6ff3343c286d3e09ecd01167a13db823f3f37c4b5798 WatchSource:0}: Error finding container ba21b93112464b645ead6ff3343c286d3e09ecd01167a13db823f3f37c4b5798: Status 404 returned error can't find the container with id ba21b93112464b645ead6ff3343c286d3e09ecd01167a13db823f3f37c4b5798 Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.143840 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6d81c3_cd2f_4848_b9c5_2571af33e758.slice/crio-a8b16aeb005b39b695738a97ac18320f193d9771289c5c8b3c7bb20691203fb0 WatchSource:0}: Error finding container a8b16aeb005b39b695738a97ac18320f193d9771289c5c8b3c7bb20691203fb0: Status 404 returned error can't find the container with id a8b16aeb005b39b695738a97ac18320f193d9771289c5c8b3c7bb20691203fb0 Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.157676 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaed35222_5301_4df9_8f23_16816ebe4871.slice/crio-52e9598abeaa561260da485f7ce6830696be95dc75406ce69c2e38bd117c548b WatchSource:0}: Error finding container 52e9598abeaa561260da485f7ce6830696be95dc75406ce69c2e38bd117c548b: Status 404 returned error can't find the container with id 52e9598abeaa561260da485f7ce6830696be95dc75406ce69c2e38bd117c548b Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.167053 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda176335f_a8bb_476a_bc6d_540be238a200.slice/crio-df7a00a5a1bff4a0a6be0b4aceab3bccce85610d3555ea187ce4945e5e86c3fc WatchSource:0}: Error finding container df7a00a5a1bff4a0a6be0b4aceab3bccce85610d3555ea187ce4945e5e86c3fc: Status 404 returned error can't find the container with id 
df7a00a5a1bff4a0a6be0b4aceab3bccce85610d3555ea187ce4945e5e86c3fc Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.196082 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.196257 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.696230514 +0000 UTC m=+144.791388710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.196345 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.196948 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-10 06:26:37.696940485 +0000 UTC m=+144.792098681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.261954 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm"] Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.300091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.300310 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.800277594 +0000 UTC m=+144.895435810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.300439 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.300903 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.800894412 +0000 UTC m=+144.896052618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.320210 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"] Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.401128 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.401459 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:37.90144179 +0000 UTC m=+144.996599986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.506576 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.507646 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.007629302 +0000 UTC m=+145.102787498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.508204 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" event={"ID":"f25f417d-9d4b-426b-949e-b724290eb645","Type":"ContainerStarted","Data":"28b27d27f9e6bbf05c7991e2b6961a2ff695d53d2b5fec6c1e4cb14e764c99a9"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.519113 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" event={"ID":"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755","Type":"ContainerStarted","Data":"724ae6e104919557b82c32e30ba88cab6891fb2a72d5202666d6470b900221d2"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.519671 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" event={"ID":"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755","Type":"ContainerStarted","Data":"f5d3a8afd6a5273ce50f5fcc2ce3503a34113ede55ae0a2496f49e28e9f01605"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.525712 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" event={"ID":"2df5f09d-0d1c-40cf-9041-695d831d552d","Type":"ContainerStarted","Data":"225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.529729 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" event={"ID":"e3700224-4110-4020-a014-a904f6710ce2","Type":"ContainerStarted","Data":"47519dda38e965fd01c2106bf5b56786be0717f95bf324866bb73cdcec938cc9"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.535130 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hq4fv" event={"ID":"a598b62a-71ad-438f-b83d-173c488d6e0a","Type":"ContainerStarted","Data":"267c0c98d1fbf7535fa83c1811e6bf9408840b4606f8ebff02dea7e3a03b15ec"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.538890 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" event={"ID":"96d524e6-85da-48fc-a1b7-a56c007380e4","Type":"ContainerStarted","Data":"b39f7381d61b58be4826cc44d54f339935b44144b7ff697a2833f14fcc2d3b9a"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.541553 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.542594 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" event={"ID":"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b","Type":"ContainerStarted","Data":"96d3c7ef1e906673fcf60d67d0cf4f9151e4b0168c7d570521059b67a23f439a"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.544116 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" event={"ID":"cb0a67f4-5fa9-400f-9877-784faffe19fd","Type":"ContainerStarted","Data":"5f3f2b1f4a62f4eee3274ee42ef80db935e31fbdf5444e3d435a2119d84c6580"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.544149 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" event={"ID":"cb0a67f4-5fa9-400f-9877-784faffe19fd","Type":"ContainerStarted","Data":"b8f014dabe326c02f7b11abccd87b2fe5c707477c4322dd66840f2fae020e449"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.544214 4822 patch_prober.go:28] interesting pod/console-operator-58897d9998-gfgpk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.544276 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" podUID="96d524e6-85da-48fc-a1b7-a56c007380e4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.545851 4822 generic.go:334] "Generic (PLEG): container finished" podID="28d3fed7-bb74-4e55-a788-e354b2f0cd5c" containerID="37366b8b8024d9348cc4188a125f2d2a1f0d1d3b239116fa633514722b225e29" exitCode=0 Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.546042 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" event={"ID":"28d3fed7-bb74-4e55-a788-e354b2f0cd5c","Type":"ContainerDied","Data":"37366b8b8024d9348cc4188a125f2d2a1f0d1d3b239116fa633514722b225e29"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.553066 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" event={"ID":"a176335f-a8bb-476a-bc6d-540be238a200","Type":"ContainerStarted","Data":"df7a00a5a1bff4a0a6be0b4aceab3bccce85610d3555ea187ce4945e5e86c3fc"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.562622 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" event={"ID":"b18ce54d-fc7a-46d0-a829-cb94946df57a","Type":"ContainerStarted","Data":"3853b6b09c3e9b7782c8eada52dc3521dc444bfbbf8e9358a42eb3bdabfe338e"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.569640 4822 generic.go:334] "Generic (PLEG): container finished" podID="2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d" containerID="499141e5d9c8aa37a57c4af574ac8c6e28d6ae796cf80fe10e7a708211a0e35c" exitCode=0 Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.569851 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" event={"ID":"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d","Type":"ContainerDied","Data":"499141e5d9c8aa37a57c4af574ac8c6e28d6ae796cf80fe10e7a708211a0e35c"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.579290 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" event={"ID":"23469a60-87a5-4400-a80b-2a3283833474","Type":"ContainerStarted","Data":"b8c58528e0781f812e740b3adfbac86f726d99d5dc67d2892f5029afaf6556c6"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.587295 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" event={"ID":"df9d10fb-c826-49b2-bc26-487b5a02822c","Type":"ContainerStarted","Data":"f1bd68c8e5c2b8329dfa0e02086ffab8c4f02db060ceb15092b25822f141e7ac"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.597645 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" event={"ID":"714a8c0a-65f3-46b2-b40b-f4faa4aeb505","Type":"ContainerStarted","Data":"56b5fbb900da1ebb652f20515357b26bfa7eaf298904a9d421a9fd2f4e4d1dcd"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.618246 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.619155 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.119134029 +0000 UTC m=+145.214292225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.642186 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" event={"ID":"3d6d81c3-cd2f-4848-b9c5-2571af33e758","Type":"ContainerStarted","Data":"a8b16aeb005b39b695738a97ac18320f193d9771289c5c8b3c7bb20691203fb0"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.681115 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" event={"ID":"ce67030c-749e-44e9-b900-843b66b80829","Type":"ContainerStarted","Data":"bbeaf12eea49ace423b5016b856e81b230953fe02d231da228ac3e7457b4dce0"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.691764 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" 
event={"ID":"206453dd-5793-4461-be49-6d3de82b1431","Type":"ContainerStarted","Data":"ba21b93112464b645ead6ff3343c286d3e09ecd01167a13db823f3f37c4b5798"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.695084 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" event={"ID":"5762cd1d-9023-42d1-908e-a2da9cf7f052","Type":"ContainerStarted","Data":"e52a0aca72e2fa1b0e12c781ebdc8b041ca84db3c531a91d97040da872db5d6e"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.722009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.722584 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.22256873 +0000 UTC m=+145.317726926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.737355 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" event={"ID":"828ae265-a267-4af7-9893-671d038878b7","Type":"ContainerStarted","Data":"5b0d14f41cb8dfe8c1eb02c6ef91b8d14a627dda5f2200e940444c7a010fa91c"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.737396 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" event={"ID":"828ae265-a267-4af7-9893-671d038878b7","Type":"ContainerStarted","Data":"4d1c7dd8edd24558760c59d898a6c6d5559139439713a91851aa2089686370e5"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.749612 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" event={"ID":"741dd6ab-d5b3-421f-94d3-de6bd19c2f86","Type":"ContainerStarted","Data":"fe1308a818c100482767ad53d7acd85403ee379b8a6b6964405b48f31c1b33ed"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.749648 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" event={"ID":"741dd6ab-d5b3-421f-94d3-de6bd19c2f86","Type":"ContainerStarted","Data":"c0e1e010577c9936927ddc9534370e96006077c8f8be05aae477515473ba1328"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.764022 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" event={"ID":"aed35222-5301-4df9-8f23-16816ebe4871","Type":"ContainerStarted","Data":"52e9598abeaa561260da485f7ce6830696be95dc75406ce69c2e38bd117c548b"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.772161 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" event={"ID":"05179c47-551e-4445-bf6e-1f328d5f024c","Type":"ContainerStarted","Data":"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.773696 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.779833 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" event={"ID":"0c5524a5-19e5-425c-b94d-c6fd6c4fd916","Type":"ContainerStarted","Data":"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.780449 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.786334 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dpxkz" event={"ID":"dc8a3486-2bb5-49fd-99ed-09a9e743932c","Type":"ContainerStarted","Data":"3e122bd284657199a40ced53aef752e3af0d3c0656e7e4ae0ebad98b5e4d9187"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.791871 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kvjlx" event={"ID":"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208","Type":"ContainerStarted","Data":"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172"} Oct 10 06:26:37 crc 
kubenswrapper[4822]: I1010 06:26:37.791914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kvjlx" event={"ID":"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208","Type":"ContainerStarted","Data":"1a84fff145105e53338569f1d6cc247e233d09f215b39f5a537f99e7b3aa0ef5"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.797137 4822 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z9kf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.797198 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.798414 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" event={"ID":"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3","Type":"ContainerStarted","Data":"350da90019c55c4cc8ce10f8235d6380b075d33d4c0eff2b82806f409df36665"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.802021 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" event={"ID":"a3f9e4cd-d614-4a34-9c5f-c097103e65fc","Type":"ContainerStarted","Data":"1da5f62b7473da8e63bea5ce42df4bfee11642f53fb0c76379c0e69de6d5e807"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.814420 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" 
event={"ID":"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444","Type":"ContainerStarted","Data":"c3022a4dfec106cd71d4b7efb310bbe32f0038de7f297df209ca117b8c717f96"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.822562 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tr8l7" event={"ID":"6e88bcbb-f0a7-4837-96e0-6ced47adb39a","Type":"ContainerStarted","Data":"807749688bb2081334cd976c98c7df3f8ef12d4c5e3b02dbaade930e524e7c49"} Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.824443 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.825773 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.826022 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.325992852 +0000 UTC m=+145.421151068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.826369 4822 patch_prober.go:28] interesting pod/downloads-7954f5f757-tr8l7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.827031 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tr8l7" podUID="6e88bcbb-f0a7-4837-96e0-6ced47adb39a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.827619 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.833494 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.33347821 +0000 UTC m=+145.428636406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.888057 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.917985 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2"] Oct 10 06:26:37 crc kubenswrapper[4822]: I1010 06:26:37.944183 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:37 crc kubenswrapper[4822]: E1010 06:26:37.944884 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.444865654 +0000 UTC m=+145.540023850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:37 crc kubenswrapper[4822]: W1010 06:26:37.967112 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d322408_d6af_47c5_afe2_995737d9d6e2.slice/crio-85e1b5d633c85246b62a95d7dfa12b86b64d08c76f55fedadab8e3a0b8d8a86a WatchSource:0}: Error finding container 85e1b5d633c85246b62a95d7dfa12b86b64d08c76f55fedadab8e3a0b8d8a86a: Status 404 returned error can't find the container with id 85e1b5d633c85246b62a95d7dfa12b86b64d08c76f55fedadab8e3a0b8d8a86a Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.030051 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ftmzs"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.033449 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.043765 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.046696 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.047082 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.54706611 +0000 UTC m=+145.642224306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.088454 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2d4j"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.094977 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-s5mxk"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.097627 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.102558 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p"] Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.102755 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 10 
06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.102786 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.117722 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dpxkz" podStartSLOduration=124.117701457 podStartE2EDuration="2m4.117701457s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.084268063 +0000 UTC m=+145.179426289" watchObservedRunningTime="2025-10-10 06:26:38.117701457 +0000 UTC m=+145.212859653"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.129470 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" podStartSLOduration=124.129447199 podStartE2EDuration="2m4.129447199s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.109412555 +0000 UTC m=+145.204570751" watchObservedRunningTime="2025-10-10 06:26:38.129447199 +0000 UTC m=+145.224605395"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.147623 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.148077 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.648061141 +0000 UTC m=+145.743219337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.168217 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kvjlx" podStartSLOduration=124.168197227 podStartE2EDuration="2m4.168197227s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.166422045 +0000 UTC m=+145.261580261" watchObservedRunningTime="2025-10-10 06:26:38.168197227 +0000 UTC m=+145.263355423"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.185716 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2qgn"]
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.250094 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx"
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.250481 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.750466533 +0000 UTC m=+145.845624739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.267648 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw"]
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.350860 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.351423 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.851403862 +0000 UTC m=+145.946562058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.428509 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sg7rz" podStartSLOduration=124.428467696 podStartE2EDuration="2m4.428467696s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.383187238 +0000 UTC m=+145.478345434" watchObservedRunningTime="2025-10-10 06:26:38.428467696 +0000 UTC m=+145.523625892"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.452164 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx"
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.452646 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:38.95263154 +0000 UTC m=+146.047789736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: W1010 06:26:38.462094 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda608e14_b5f2_4943_b344_ed8a280963b8.slice/crio-058e7d80ece6f2e9b7fa0f990955916035c67f3ed5a1f952d5669ad535b2e0f2 WatchSource:0}: Error finding container 058e7d80ece6f2e9b7fa0f990955916035c67f3ed5a1f952d5669ad535b2e0f2: Status 404 returned error can't find the container with id 058e7d80ece6f2e9b7fa0f990955916035c67f3ed5a1f952d5669ad535b2e0f2
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.462821 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tr8l7" podStartSLOduration=124.462786576 podStartE2EDuration="2m4.462786576s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.42824666 +0000 UTC m=+145.523404866" watchObservedRunningTime="2025-10-10 06:26:38.462786576 +0000 UTC m=+145.557944772"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.488409 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" podStartSLOduration=124.488381371 podStartE2EDuration="2m4.488381371s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.479161292 +0000 UTC m=+145.574319498" watchObservedRunningTime="2025-10-10 06:26:38.488381371 +0000 UTC m=+145.583539587"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.555844 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.556307 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.056290188 +0000 UTC m=+146.151448384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.558513 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fwsh4" podStartSLOduration=124.558487223 podStartE2EDuration="2m4.558487223s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.550742197 +0000 UTC m=+145.645900393" watchObservedRunningTime="2025-10-10 06:26:38.558487223 +0000 UTC m=+145.653645419"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.658965 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx"
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.659639 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.159615067 +0000 UTC m=+146.254773323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.673271 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" podStartSLOduration=123.673252234 podStartE2EDuration="2m3.673252234s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.672525263 +0000 UTC m=+145.767683459" watchObservedRunningTime="2025-10-10 06:26:38.673252234 +0000 UTC m=+145.768410430"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.760367 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.760717 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.26067318 +0000 UTC m=+146.355831376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.843757 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" event={"ID":"a176335f-a8bb-476a-bc6d-540be238a200","Type":"ContainerStarted","Data":"f05936b518bd89d54b5b85413c66e23cffeac395745311b2c6bde21d5ee7a831"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.845307 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.847960 4822 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cvtdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.848020 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.851681 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" event={"ID":"df9d10fb-c826-49b2-bc26-487b5a02822c","Type":"ContainerStarted","Data":"945aa0f3a47179faf189ad77e81123a22cb5d0c159c2191a5e5a5331d5fb709f"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.862260 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx"
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.864397 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.36438351 +0000 UTC m=+146.459541706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.864613 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" event={"ID":"5762cd1d-9023-42d1-908e-a2da9cf7f052","Type":"ContainerStarted","Data":"94ee6da8420f33b21aa3ca84288e9746b3c2d1567b042faad028fa71a1a8fc11"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.878213 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" podStartSLOduration=123.878193822 podStartE2EDuration="2m3.878193822s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.875630718 +0000 UTC m=+145.970788924" watchObservedRunningTime="2025-10-10 06:26:38.878193822 +0000 UTC m=+145.973352028"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.911368 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5554q" podStartSLOduration=124.911338317 podStartE2EDuration="2m4.911338317s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.906113875 +0000 UTC m=+146.001272081" watchObservedRunningTime="2025-10-10 06:26:38.911338317 +0000 UTC m=+146.006496523"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.920825 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" event={"ID":"2df5f09d-0d1c-40cf-9041-695d831d552d","Type":"ContainerStarted","Data":"4028de656a268672b0a3b3809074bd83441264c07fc5342630745c98b932cd3b"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.929725 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" event={"ID":"f25f417d-9d4b-426b-949e-b724290eb645","Type":"ContainerStarted","Data":"275648e16d453933fda608c503de9e1d35652aa8ab41e1ad0aade982576d8b0d"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.945363 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dpxkz" event={"ID":"dc8a3486-2bb5-49fd-99ed-09a9e743932c","Type":"ContainerStarted","Data":"9e8e1306fa8e46eb3b9b24a65eca7c62b0c70d0b920a32f9fa4abf74e0862588"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.962322 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" podStartSLOduration=124.962303702 podStartE2EDuration="2m4.962303702s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.961684804 +0000 UTC m=+146.056843020" watchObservedRunningTime="2025-10-10 06:26:38.962303702 +0000 UTC m=+146.057461898"
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.963647 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:38 crc kubenswrapper[4822]: E1010 06:26:38.964672 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.464631679 +0000 UTC m=+146.559789985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.970211 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" event={"ID":"cb0a67f4-5fa9-400f-9877-784faffe19fd","Type":"ContainerStarted","Data":"5532f8fa9e4fa164e27e41091448d4b16fead5cbf5df132eb085ae935c64452f"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.978901 4822 generic.go:334] "Generic (PLEG): container finished" podID="f6bbb19f-3429-4af5-a28a-a0d0815f8ff3" containerID="31db4050ff698585ae1a5f51518e22d237bbbeb48c5ce53a9a3d395ede00a0e3" exitCode=0
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.978963 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" event={"ID":"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3","Type":"ContainerDied","Data":"31db4050ff698585ae1a5f51518e22d237bbbeb48c5ce53a9a3d395ede00a0e3"}
Oct 10 06:26:38 crc kubenswrapper[4822]: I1010 06:26:38.991309 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nhr9f" podStartSLOduration=124.991294106 podStartE2EDuration="2m4.991294106s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:38.989547105 +0000 UTC m=+146.084705301" watchObservedRunningTime="2025-10-10 06:26:38.991294106 +0000 UTC m=+146.086452292"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.005208 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" event={"ID":"27d8f9ac-f418-46b8-9f7a-8bfc8dde1755","Type":"ContainerStarted","Data":"f9baebc9e68758fac68d0fe3895f21a24d66f576a4040c53d1638906c023d695"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.010886 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2qgn" event={"ID":"9ea45453-41a2-41f9-b512-8184264743de","Type":"ContainerStarted","Data":"3a9d7c0b73fe0e057fc95e858051d26bf0e81c5fd51e5d1f4df9dc4c50fa7753"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.020397 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" event={"ID":"39066f14-1817-4867-af96-ee099b009933","Type":"ContainerStarted","Data":"efaf0077e0dbc23a64e4561884bc737f8f8421bc1472b4543c0c392cc2308049"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.031620 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" event={"ID":"28d3fed7-bb74-4e55-a788-e354b2f0cd5c","Type":"ContainerStarted","Data":"80b76cdc2a9a885eb8c263c615a8be262434abd3e8f8fe858e32e71c212f1d3c"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.032223 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.035600 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" event={"ID":"80899bb7-40c7-4bb1-8a61-d620fffdc036","Type":"ContainerStarted","Data":"463ad946279b7e3526e1e1f79db901991bc4da55e4d50dff89999520beae6a0d"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.037524 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hq4fv" event={"ID":"a598b62a-71ad-438f-b83d-173c488d6e0a","Type":"ContainerStarted","Data":"f0cb1f9579de3b4d89c5037c233c1bf49b3efd335b2467a6d56e9ac7a3e284c4"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.046129 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-h59tt" podStartSLOduration=125.046107542 podStartE2EDuration="2m5.046107542s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.043594879 +0000 UTC m=+146.138753085" watchObservedRunningTime="2025-10-10 06:26:39.046107542 +0000 UTC m=+146.141265758"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.063979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" event={"ID":"e3700224-4110-4020-a014-a904f6710ce2","Type":"ContainerStarted","Data":"50e22229987711bd7ab6c782f902140a6436c7fa4af8b03aa96a0a0489c4dab9"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.065867 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx"
Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.068720 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.568679029 +0000 UTC m=+146.663837225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.069742 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" podStartSLOduration=125.069710629 podStartE2EDuration="2m5.069710629s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.066371682 +0000 UTC m=+146.161529888" watchObservedRunningTime="2025-10-10 06:26:39.069710629 +0000 UTC m=+146.164868825"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.085644 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" event={"ID":"248e0ae9-9e34-4c1e-bfbe-e60cbebc2444","Type":"ContainerStarted","Data":"9d244c0e32ae1d247a9a22ff31ffc8d96f70045cf5f7add4843f26c4cb3c82f5"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.095236 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" event={"ID":"23469a60-87a5-4400-a80b-2a3283833474","Type":"ContainerStarted","Data":"de9eeb1d35079657649ed74fcbb09a5ebffeca538a2c66b899bf25223ace553a"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.104568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" event={"ID":"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b","Type":"ContainerStarted","Data":"fdc9c108b817f222a19be649989d09181a80b1671910475ac9fc9ae71f6af62c"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.104623 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" event={"ID":"afbff33f-3af0-4edf-b6d7-60a42cc2bc0b","Type":"ContainerStarted","Data":"9ddbb8f3be7cbedfd25d52f331b902e123d0fed9d781f403bbc5fbbd2ce3f57d"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.127305 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2fchz" podStartSLOduration=125.127274276 podStartE2EDuration="2m5.127274276s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.103561485 +0000 UTC m=+146.198719681" watchObservedRunningTime="2025-10-10 06:26:39.127274276 +0000 UTC m=+146.222432472"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.128417 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hq4fv" podStartSLOduration=6.128407469 podStartE2EDuration="6.128407469s" podCreationTimestamp="2025-10-10 06:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.085972573 +0000 UTC m=+146.181130789" watchObservedRunningTime="2025-10-10 06:26:39.128407469 +0000 UTC m=+146.223565665"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.152492 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 10 06:26:39 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld
Oct 10 06:26:39 crc kubenswrapper[4822]: [+]process-running ok
Oct 10 06:26:39 crc kubenswrapper[4822]: healthz check failed
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.152601 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.153196 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" event={"ID":"b18ce54d-fc7a-46d0-a829-cb94946df57a","Type":"ContainerStarted","Data":"7f6d9663899c18d443b5ecf38ccdb5e2fbfc1dd0ab76876fdf8562dccbd8168d"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.154493 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.161278 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" event={"ID":"f5ada1bf-ca71-45b9-885f-9bd75ba7d400","Type":"ContainerStarted","Data":"0c1d52565f05d08ccbb196ceedb04f838d80304bc8ac52fb0e7fc1d102e84bc5"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.174697 4822 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pvwp8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body=
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.174754 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" podUID="b18ce54d-fc7a-46d0-a829-cb94946df57a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.180418 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p5bvv" podStartSLOduration=124.180401643 podStartE2EDuration="2m4.180401643s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.147058002 +0000 UTC m=+146.242216198" watchObservedRunningTime="2025-10-10 06:26:39.180401643 +0000 UTC m=+146.275559839"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.184337 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.185766 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.685747598 +0000 UTC m=+146.780905784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.188369 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" event={"ID":"96d524e6-85da-48fc-a1b7-a56c007380e4","Type":"ContainerStarted","Data":"fedfa94789e2c5fef96763c97bd50d09e872dad3bdfee88ff56ceb66361b8782"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.189062 4822 patch_prober.go:28] interesting pod/console-operator-58897d9998-gfgpk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.189111 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" podUID="96d524e6-85da-48fc-a1b7-a56c007380e4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.202274 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" event={"ID":"87c8c670-210c-4cb9-8e2b-805a80f2fcbd","Type":"ContainerStarted","Data":"2123af24b6a2cb19f24c5cbd74921211104abf34f98a18db57c9f03351a2c6bf"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.212039 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" event={"ID":"aed35222-5301-4df9-8f23-16816ebe4871","Type":"ContainerStarted","Data":"99cc474ed7aca4980a985fb18c80cb6310dcea1bfbd9f8ccc4eabf362f6bb2d8"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.224753 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" event={"ID":"0319e3d8-e3a7-499e-962a-efeb8bc2e3a3","Type":"ContainerStarted","Data":"69b0434cfd0d912d1a814a34a5d5ec5e4eb2e199ccfcee767c5e4c3c8213207b"}
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.231873 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nd8fx" podStartSLOduration=124.231850631 podStartE2EDuration="2m4.231850631s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.179323491 +0000 UTC m=+146.274481687" watchObservedRunningTime="2025-10-10 06:26:39.231850631 +0000 UTC m=+146.327008827"
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.232845 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qgtpl" podStartSLOduration=125.23283925 podStartE2EDuration="2m5.23283925s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.210839679 +0000 UTC m=+146.305997895" 
watchObservedRunningTime="2025-10-10 06:26:39.23283925 +0000 UTC m=+146.327997446" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.236269 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" podStartSLOduration=124.236253599 podStartE2EDuration="2m4.236253599s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.233734236 +0000 UTC m=+146.328892432" watchObservedRunningTime="2025-10-10 06:26:39.236253599 +0000 UTC m=+146.331411795" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.241215 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" event={"ID":"9d368e81-49c5-4a8c-8903-d393afe2e509","Type":"ContainerStarted","Data":"1c6c5e1e2312f9ad05538ca9d98e3f610b254b66f3e4caa62fd98e7b9ea0b938"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.250229 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" event={"ID":"714a8c0a-65f3-46b2-b40b-f4faa4aeb505","Type":"ContainerStarted","Data":"628a8d1897f823451fa3092c15c17651049cfa2566d5dc14fe5f365d56171785"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.250306 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" event={"ID":"714a8c0a-65f3-46b2-b40b-f4faa4aeb505","Type":"ContainerStarted","Data":"02ce4e40635112389897141ab7f3799ab43a88fcc974fcc7379a361129e07a64"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.260046 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hdbjq" podStartSLOduration=125.260029741 podStartE2EDuration="2m5.260029741s" 
podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.259178037 +0000 UTC m=+146.354336273" watchObservedRunningTime="2025-10-10 06:26:39.260029741 +0000 UTC m=+146.355187937" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.260636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2d4j" event={"ID":"17e41941-eca6-4d67-8d1f-097ae487537b","Type":"ContainerStarted","Data":"3f9c878759ce896807914e19fbd332db61cc114a193d5ae9de3f517f3096affe"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.279722 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" event={"ID":"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3","Type":"ContainerStarted","Data":"3b581c66333e2cd4bf38840cee0879417407300038974076aaac3f61982bc79d"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.280849 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.288509 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.293329 4822 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q5kvv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= 
Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.293382 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" podUID="12c90526-9d8f-4cf1-9adc-3b51ea32b8b3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.294512 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.794490575 +0000 UTC m=+146.889648871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.295544 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" event={"ID":"206453dd-5793-4461-be49-6d3de82b1431","Type":"ContainerStarted","Data":"6449604fb00ac435b1bd3c77cef69d010857e524629df7be79fd9e6f005976ca"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.295759 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.298857 4822 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9rx2j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.298914 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" podUID="206453dd-5793-4461-be49-6d3de82b1431" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.325530 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" event={"ID":"da608e14-b5f2-4943-b344-ed8a280963b8","Type":"ContainerStarted","Data":"058e7d80ece6f2e9b7fa0f990955916035c67f3ed5a1f952d5669ad535b2e0f2"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.331174 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bprpb" podStartSLOduration=125.331157503 podStartE2EDuration="2m5.331157503s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.330412361 +0000 UTC m=+146.425570567" watchObservedRunningTime="2025-10-10 06:26:39.331157503 +0000 UTC m=+146.426315699" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.332727 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9dtp5" podStartSLOduration=125.332694827 podStartE2EDuration="2m5.332694827s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.305004751 +0000 UTC 
m=+146.400162967" watchObservedRunningTime="2025-10-10 06:26:39.332694827 +0000 UTC m=+146.427853023" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.340267 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.343154 4822 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pv22z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.343199 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.344571 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" event={"ID":"3d6d81c3-cd2f-4848-b9c5-2571af33e758","Type":"ContainerStarted","Data":"94645e6ac9e8d7d756f5816e20a81e86dc36811cbe4fa34c4cb2ff5436c6f8ca"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.361665 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfzbs" podStartSLOduration=125.361645641 podStartE2EDuration="2m5.361645641s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.358070526 +0000 UTC m=+146.453228712" watchObservedRunningTime="2025-10-10 06:26:39.361645641 +0000 UTC m=+146.456803837" Oct 10 
06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.369708 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" event={"ID":"3d322408-d6af-47c5-afe2-995737d9d6e2","Type":"ContainerStarted","Data":"85e1b5d633c85246b62a95d7dfa12b86b64d08c76f55fedadab8e3a0b8d8a86a"} Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.370269 4822 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z9kf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.370303 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.382387 4822 patch_prober.go:28] interesting pod/downloads-7954f5f757-tr8l7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.382441 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tr8l7" podUID="6e88bcbb-f0a7-4837-96e0-6ced47adb39a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.393984 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.395335 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.895304401 +0000 UTC m=+146.990462597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.422618 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" podStartSLOduration=124.422575845 podStartE2EDuration="2m4.422575845s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.421246126 +0000 UTC m=+146.516404342" watchObservedRunningTime="2025-10-10 06:26:39.422575845 +0000 UTC m=+146.517734051" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.471562 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" podStartSLOduration=124.47153112 podStartE2EDuration="2m4.47153112s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.468747589 +0000 UTC m=+146.563905785" watchObservedRunningTime="2025-10-10 06:26:39.47153112 +0000 UTC m=+146.566689316" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.496916 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.497278 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:39.99726699 +0000 UTC m=+147.092425186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.527618 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" podStartSLOduration=125.527594493 podStartE2EDuration="2m5.527594493s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:39.524922455 +0000 UTC m=+146.620080671" watchObservedRunningTime="2025-10-10 06:26:39.527594493 +0000 UTC m=+146.622752689" Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.597728 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.598291 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.098253241 +0000 UTC m=+147.193411437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.704943 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.705526 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.205510654 +0000 UTC m=+147.300668850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.807830 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.808463 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.308437301 +0000 UTC m=+147.403595497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:39 crc kubenswrapper[4822]: I1010 06:26:39.919426 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:39 crc kubenswrapper[4822]: E1010 06:26:39.923617 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.423598105 +0000 UTC m=+147.518756301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.004241 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.025323 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.025797 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.52576604 +0000 UTC m=+147.620924236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.101664 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:40 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:40 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:40 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.102030 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.127129 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.127473 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:40.627458461 +0000 UTC m=+147.722616657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.230919 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.231318 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.731303625 +0000 UTC m=+147.826461811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.332344 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.333000 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.832970966 +0000 UTC m=+147.928129332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.388829 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" event={"ID":"12c90526-9d8f-4cf1-9adc-3b51ea32b8b3","Type":"ContainerStarted","Data":"28b9a6017e0bdc60e81a2585b1ac6bcfa2ca8c0ba18cfb3aca73f5df827c44f3"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.389354 4822 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q5kvv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.389418 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" podUID="12c90526-9d8f-4cf1-9adc-3b51ea32b8b3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.391831 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" event={"ID":"5762cd1d-9023-42d1-908e-a2da9cf7f052","Type":"ContainerStarted","Data":"a6ffe924c67408e63e117cf28416478db3f892ebc6a7e4b799738433f527ac8c"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.394717 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" event={"ID":"da608e14-b5f2-4943-b344-ed8a280963b8","Type":"ContainerStarted","Data":"d6aa0fb882572a959aee0c8b1cbbc8621c3f57dcbf13bc77a9e04234e383f344"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.396577 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" event={"ID":"ce67030c-749e-44e9-b900-843b66b80829","Type":"ContainerStarted","Data":"acf13832d58fb9fa48ead29170646d0dd322adb25aedb8a533a13e439fd71cf0"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.400341 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" event={"ID":"3d6d81c3-cd2f-4848-b9c5-2571af33e758","Type":"ContainerStarted","Data":"a4d3c2a558a2529358cdb4f45b8e81628bce60d03909e95ed7211b92efea6f6c"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.400511 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.402487 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" event={"ID":"f6bbb19f-3429-4af5-a28a-a0d0815f8ff3","Type":"ContainerStarted","Data":"b6596c0ed0dbe51e5b0fb2b9503fb65b8e36f5f924a5afbac9d6a1959e5f1de3"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.406559 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" event={"ID":"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d","Type":"ContainerStarted","Data":"940eff88d4e0868417ff65b66ff712d2fd1423e3b1a81ffc5fb5d37508e0dc1c"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.406636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" 
event={"ID":"2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d","Type":"ContainerStarted","Data":"3cb9aa1914fcc079397e2347ea66b2dd7bf89c10796b09dd82021bd271ce3a06"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.407935 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" event={"ID":"80899bb7-40c7-4bb1-8a61-d620fffdc036","Type":"ContainerStarted","Data":"40c03a530133589d0a7db5f5ffa279934d32f757afdac0ad6b02ef453799d1cd"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.410276 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2d4j" event={"ID":"17e41941-eca6-4d67-8d1f-097ae487537b","Type":"ContainerStarted","Data":"ff245d143604ac7c387932d017582106af8115ff9712562128918bf1f7c4d2e7"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.411982 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" event={"ID":"3d322408-d6af-47c5-afe2-995737d9d6e2","Type":"ContainerStarted","Data":"44f016761e4761d1c514fccb0cfa6afd90ec26fcda78b7eee4f62b3a5816a6eb"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.413521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" event={"ID":"39066f14-1817-4867-af96-ee099b009933","Type":"ContainerStarted","Data":"812a7c0bcde36a4a1d37b0ed3a0229418b2ac991ef9ca788dc81662c4300fc70"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.415724 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2qgn" event={"ID":"9ea45453-41a2-41f9-b512-8184264743de","Type":"ContainerStarted","Data":"301831d9dda5e0aa703a842c10ddceb588077e4c6a7f4ab6248ec99a7cb61863"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.415859 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.418188 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" event={"ID":"f25f417d-9d4b-426b-949e-b724290eb645","Type":"ContainerStarted","Data":"38ca054dc15ab7fc791ab30d3594ac698e7263de39dffa719b376b2dbdef3a2e"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.420650 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" event={"ID":"a3f9e4cd-d614-4a34-9c5f-c097103e65fc","Type":"ContainerStarted","Data":"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.422197 4822 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pv22z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.422301 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.427070 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" event={"ID":"87c8c670-210c-4cb9-8e2b-805a80f2fcbd","Type":"ContainerStarted","Data":"9804d4f088de95e90b47f0c78e838947064454cb2347c81b06a257c3c88fb0af"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.427322 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" event={"ID":"87c8c670-210c-4cb9-8e2b-805a80f2fcbd","Type":"ContainerStarted","Data":"9b0f88b0399c5f9e698ae14d5344603ea00bea16d8fb6f5f2f05d72217fbb70f"} Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.428043 4822 patch_prober.go:28] interesting pod/downloads-7954f5f757-tr8l7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.428124 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tr8l7" podUID="6e88bcbb-f0a7-4837-96e0-6ced47adb39a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.428668 4822 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cvtdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.428696 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.429190 4822 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9rx2j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: 
connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.429267 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" podUID="206453dd-5793-4461-be49-6d3de82b1431" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.433638 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.435094 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:40.935064929 +0000 UTC m=+148.030223155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.439951 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hbrhm" podStartSLOduration=126.43992004 podStartE2EDuration="2m6.43992004s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.43855063 +0000 UTC m=+147.533708836" watchObservedRunningTime="2025-10-10 06:26:40.43992004 +0000 UTC m=+147.535078236" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.490279 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gfgpk" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.543409 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.547040 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:41.047021459 +0000 UTC m=+148.142179655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.553636 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.554097 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.565542 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p2d4j" podStartSLOduration=7.565521187 podStartE2EDuration="7.565521187s" podCreationTimestamp="2025-10-10 06:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.510168516 +0000 UTC m=+147.605326712" watchObservedRunningTime="2025-10-10 06:26:40.565521187 +0000 UTC m=+147.660679384" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.577465 4822 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kznjd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.577614 4822 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-kznjd" podUID="2be59d8f-4f1a-41bf-a5f0-6dbe2df4cb5d" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.585358 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.586659 4822 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-p8fht container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.586739 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" podUID="f6bbb19f-3429-4af5-a28a-a0d0815f8ff3" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.601543 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.625960 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mfnp2" podStartSLOduration=126.625932307 podStartE2EDuration="2m6.625932307s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.577930089 +0000 UTC m=+147.673088295" watchObservedRunningTime="2025-10-10 06:26:40.625932307 +0000 UTC m=+147.721090503" Oct 10 06:26:40 crc 
kubenswrapper[4822]: I1010 06:26:40.657165 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7c67x" podStartSLOduration=126.657141326 podStartE2EDuration="2m6.657141326s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.614065201 +0000 UTC m=+147.709223397" watchObservedRunningTime="2025-10-10 06:26:40.657141326 +0000 UTC m=+147.752299512" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.658500 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.659160 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.159130433 +0000 UTC m=+148.254288629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.678018 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rjz5p" podStartSLOduration=126.677999773 podStartE2EDuration="2m6.677999773s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.657943199 +0000 UTC m=+147.753101405" watchObservedRunningTime="2025-10-10 06:26:40.677999773 +0000 UTC m=+147.773157969" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.733405 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" podStartSLOduration=125.733371015 podStartE2EDuration="2m5.733371015s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.723476647 +0000 UTC m=+147.818634843" watchObservedRunningTime="2025-10-10 06:26:40.733371015 +0000 UTC m=+147.828529211" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.761017 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.761457 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.261441733 +0000 UTC m=+148.356599929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.809349 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-s5mxk" podStartSLOduration=125.809323227 podStartE2EDuration="2m5.809323227s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.808225715 +0000 UTC m=+147.903383921" watchObservedRunningTime="2025-10-10 06:26:40.809323227 +0000 UTC m=+147.904481423" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.862681 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: 
E1010 06:26:40.863219 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.363174735 +0000 UTC m=+148.458332931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.863530 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.864074 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.364055891 +0000 UTC m=+148.459214087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.891307 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gh994" podStartSLOduration=126.891285154 podStartE2EDuration="2m6.891285154s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.841203155 +0000 UTC m=+147.936361351" watchObservedRunningTime="2025-10-10 06:26:40.891285154 +0000 UTC m=+147.986443350" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.915385 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mx968" podStartSLOduration=126.915366995 podStartE2EDuration="2m6.915366995s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.914311814 +0000 UTC m=+148.009470020" watchObservedRunningTime="2025-10-10 06:26:40.915366995 +0000 UTC m=+148.010525191" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.917465 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-r2qgn" podStartSLOduration=7.917457826 podStartE2EDuration="7.917457826s" podCreationTimestamp="2025-10-10 06:26:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.893448827 +0000 UTC m=+147.988607033" watchObservedRunningTime="2025-10-10 06:26:40.917457826 +0000 UTC m=+148.012616022" Oct 10 06:26:40 crc kubenswrapper[4822]: I1010 06:26:40.965007 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:40 crc kubenswrapper[4822]: E1010 06:26:40.965748 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.465716541 +0000 UTC m=+148.560874737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.068156 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.068720 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.56870353 +0000 UTC m=+148.663861726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.111012 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:41 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:41 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:41 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.111111 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.169765 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.170292 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:41.670269398 +0000 UTC m=+148.765427584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.172482 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" podStartSLOduration=126.172462472 podStartE2EDuration="2m6.172462472s" podCreationTimestamp="2025-10-10 06:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:41.14116833 +0000 UTC m=+148.236326546" watchObservedRunningTime="2025-10-10 06:26:41.172462472 +0000 UTC m=+148.267620668" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.175622 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2pmgw" podStartSLOduration=127.175600223 podStartE2EDuration="2m7.175600223s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:40.945186263 +0000 UTC m=+148.040344469" watchObservedRunningTime="2025-10-10 06:26:41.175600223 +0000 UTC m=+148.270758419" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.271546 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.272114 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.772097102 +0000 UTC m=+148.867255298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.340022 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" podStartSLOduration=127.34000185 podStartE2EDuration="2m7.34000185s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:41.337072344 +0000 UTC m=+148.432230550" watchObservedRunningTime="2025-10-10 06:26:41.34000185 +0000 UTC m=+148.435160046" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.373262 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.373546 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.873498755 +0000 UTC m=+148.968656951 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.373787 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.374242 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.874224706 +0000 UTC m=+148.969382902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.430692 4822 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pvwp8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.430762 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" podUID="b18ce54d-fc7a-46d0-a829-cb94946df57a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.449561 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2qgn" event={"ID":"9ea45453-41a2-41f9-b512-8184264743de","Type":"ContainerStarted","Data":"f97ab11a0909403b08d4182ef86d96f219d60ca119a46e6c17dd4ad11f3f4c3b"} Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.453264 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" event={"ID":"f5ada1bf-ca71-45b9-885f-9bd75ba7d400","Type":"ContainerStarted","Data":"9cafc33ee136a977aeaaff4aa256a0ce5d4f2372c6c3f7388200db58c8c355d7"} Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 
06:26:41.472320 4822 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cvtdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.472431 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.475321 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.475559 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.975516666 +0000 UTC m=+149.070674862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.475708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.476187 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:41.976170365 +0000 UTC m=+149.071328561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.482774 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q5kvv" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.576874 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.578769 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.078746912 +0000 UTC m=+149.173905108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.623084 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pvwp8" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.680580 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.681262 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.681394 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:41 crc 
kubenswrapper[4822]: I1010 06:26:41.683791 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.681437 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.684022 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.684668 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.184653656 +0000 UTC m=+149.279811852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.730011 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.730592 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.748893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.785251 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.786235 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.286204453 +0000 UTC m=+149.381362649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.888153 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.888677 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.388657267 +0000 UTC m=+149.483815463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.983536 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.989873 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.990254 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.490219354 +0000 UTC m=+149.585377560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:41 crc kubenswrapper[4822]: I1010 06:26:41.990447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:41 crc kubenswrapper[4822]: E1010 06:26:41.990852 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.490833082 +0000 UTC m=+149.585991438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.003665 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.014532 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.091657 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.091860 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.591832073 +0000 UTC m=+149.686990269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.092131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.092541 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.592532043 +0000 UTC m=+149.687690239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.105681 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:42 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:42 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:42 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.106345 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.193833 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.194308 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:42.694289367 +0000 UTC m=+149.789447563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.296173 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.296507 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.796493113 +0000 UTC m=+149.891651319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.392054 4822 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n9rf4 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.392138 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" podUID="28d3fed7-bb74-4e55-a788-e354b2f0cd5c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.396949 4822 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n9rf4 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.397038 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" podUID="28d3fed7-bb74-4e55-a788-e354b2f0cd5c" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.398670 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.399092 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:42.89907332 +0000 UTC m=+149.994231516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.463040 4822 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pv22z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.463142 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.501156 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.501858 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:26:43.001832022 +0000 UTC m=+150.096990208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.518282 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.519322 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.528060 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.580566 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.623545 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.623957 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities\") pod \"certified-operators-6ndhv\" (UID: 
\"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.624021 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whq8d\" (UniqueName: \"kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.624105 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.626455 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.126432021 +0000 UTC m=+150.221590217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.733383 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.733433 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.733463 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whq8d\" (UniqueName: \"kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.733519 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content\") pod \"certified-operators-6ndhv\" (UID: 
\"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.734003 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.734298 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.234283221 +0000 UTC m=+150.329441417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.734686 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.739595 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fdk5h"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.741438 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.764930 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.765336 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdk5h"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.821568 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whq8d\" (UniqueName: \"kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d\") pod \"certified-operators-6ndhv\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") " pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.836394 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.836598 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lkbd\" (UniqueName: \"kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.836630 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content\") pod \"community-operators-fdk5h\" (UID: 
\"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.836652 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.836762 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.336745595 +0000 UTC m=+150.431903791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.856699 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.888295 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.889418 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.903197 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.937569 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.937621 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.937662 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.937721 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lkbd\" (UniqueName: \"kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: E1010 06:26:42.938215 4822 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.438202639 +0000 UTC m=+150.533360845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.938666 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.938706 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:42 crc kubenswrapper[4822]: I1010 06:26:42.968734 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lkbd\" (UniqueName: \"kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd\") pod \"community-operators-fdk5h\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") " pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.042310 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.042755 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn97s\" (UniqueName: \"kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.042820 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.042889 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.042977 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.54296274 +0000 UTC m=+150.638120936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.074549 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.077786 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.098264 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.119126 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:43 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:43 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:43 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.119187 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.120728 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:26:43 crc kubenswrapper[4822]: W1010 06:26:43.136818 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b4bfd176419087ce07e73ddfefa1cbc6b4c829214f28e66c58e0c427abad9b37 WatchSource:0}: Error finding container b4bfd176419087ce07e73ddfefa1cbc6b4c829214f28e66c58e0c427abad9b37: Status 404 returned error can't find the container with id b4bfd176419087ce07e73ddfefa1cbc6b4c829214f28e66c58e0c427abad9b37 Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144194 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144259 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn97s\" (UniqueName: \"kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144326 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpg8\" 
(UniqueName: \"kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144357 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144380 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.144881 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.145136 4822 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.645125305 +0000 UTC m=+150.740283501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.145491 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.193959 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn97s\" (UniqueName: \"kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s\") pod \"certified-operators-xc5gb\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.238103 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.245099 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.245259 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.245336 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.246611 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpg8\" (UniqueName: \"kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.247738 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " 
pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.247847 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.747829666 +0000 UTC m=+150.842987862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.248083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.332046 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpg8\" (UniqueName: \"kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8\") pod \"community-operators-r42ph\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.343913 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.353611 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.354233 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.854218394 +0000 UTC m=+150.949376590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.374583 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.374711 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.401815 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.402271 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.402438 4822 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n9rf4 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.402491 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" podUID="28d3fed7-bb74-4e55-a788-e354b2f0cd5c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.438860 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.454457 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.455130 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.455179 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.455373 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:43.955354479 +0000 UTC m=+151.050512685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.526197 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" event={"ID":"f5ada1bf-ca71-45b9-885f-9bd75ba7d400","Type":"ContainerStarted","Data":"1ba88dc5d34f2e904a35f635a6c8bb9aefdfd32d82370372de7bd85c9e79bfc6"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.526261 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" event={"ID":"f5ada1bf-ca71-45b9-885f-9bd75ba7d400","Type":"ContainerStarted","Data":"08d2feac52dbe02a2df66aff44c454ba47b1c19f6b4b58d0f0f3509ee5015e57"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.532868 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1b9675aa7cd97c582e2eb6c59ed9784df844fbea5d2c42ea7044b3582611e030"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.534818 4822 generic.go:334] "Generic (PLEG): container finished" podID="2df5f09d-0d1c-40cf-9041-695d831d552d" containerID="4028de656a268672b0a3b3809074bd83441264c07fc5342630745c98b932cd3b" exitCode=0 Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.534889 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" 
event={"ID":"2df5f09d-0d1c-40cf-9041-695d831d552d","Type":"ContainerDied","Data":"4028de656a268672b0a3b3809074bd83441264c07fc5342630745c98b932cd3b"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.540206 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8397d2ab4f99dcb7610b609369622d09b22d80bafd32ca381bfee57200141e02"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.550603 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b4bfd176419087ce07e73ddfefa1cbc6b4c829214f28e66c58e0c427abad9b37"} Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.556909 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.556981 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.557020 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: 
\"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.557092 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.557401 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.05738563 +0000 UTC m=+151.152543826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.637098 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.658688 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.658844 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.158815244 +0000 UTC m=+151.253973450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.659329 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.659735 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.15972231 +0000 UTC m=+151.254880516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.728489 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.760940 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.761219 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.261058751 +0000 UTC m=+151.356216947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.762197 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.762931 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.262919065 +0000 UTC m=+151.358077261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.779002 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"] Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.865396 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.866066 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.366036638 +0000 UTC m=+151.461194834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:43 crc kubenswrapper[4822]: I1010 06:26:43.967450 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:43 crc kubenswrapper[4822]: E1010 06:26:43.981429 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.481392697 +0000 UTC m=+151.576550893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.071516 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.071916 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.571900923 +0000 UTC m=+151.667059119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.107732 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:44 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:44 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:44 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.108117 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.133684 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdk5h"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.172551 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.175316 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: 
\"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.175638 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.675624893 +0000 UTC m=+151.770783079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.269031 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.282211 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.282541 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.782521576 +0000 UTC m=+151.877679782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.321333 4822 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.383257 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.387260 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.387669 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.887649057 +0000 UTC m=+151.982807313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.405163 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n9rf4" Oct 10 06:26:44 crc kubenswrapper[4822]: W1010 06:26:44.463422 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc278601e_ee44_4ecf_9354_6eadee3d6788.slice/crio-4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da WatchSource:0}: Error finding container 4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da: Status 404 returned error can't find the container with id 4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.488020 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.490515 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:44.990484852 +0000 UTC m=+152.085643048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.582981 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1c9b19fefdd7305769338e19c6ab03253f24f11d33c2abd928f4a4d1403f74b7"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.583304 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.589265 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.589571 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:45.089556107 +0000 UTC m=+152.184714303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.600640 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerStarted","Data":"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.600696 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerStarted","Data":"8a8fe185a0ddd65fe01ee8f914ef170b552588c023d0ea7e6d6a132965c65c81"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.608206 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.609272 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerStarted","Data":"b5d6f8bd5fffbd7bdc00a7e93712079358b07196854b9ddd965966b3e916964a"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.609302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerStarted","Data":"510e5bf3723c48756ddd8f775ca28addb869aa8bd96bded354d68ffc41e32572"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.615706 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"db04a27808b1c5969e16cec416ff21312eb6fef0207a6687c261109a90cdbb07"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.631119 4822 generic.go:334] "Generic (PLEG): container finished" podID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerID="825ec843bd69f3b6e04dd0e35407995c786ba6dd53a3fe81642dfc322247e64d" exitCode=0 Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.631242 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerDied","Data":"825ec843bd69f3b6e04dd0e35407995c786ba6dd53a3fe81642dfc322247e64d"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.631269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerStarted","Data":"0e1a96eb4e07d2fa9039f0b0154e66feac7b0724e5c9dfe6d15e36f67243ec41"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.645443 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"00f1025de606a3fe7e1dbf4ce755de52a3c1c3a14eeb0d44dd60a7a166f5eabc"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.650818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerStarted","Data":"4479d2486a51781138d36b996597fce95c9d2422981180d9a158576d8ddec3c6"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.651540 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c278601e-ee44-4ecf-9354-6eadee3d6788","Type":"ContainerStarted","Data":"4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da"} Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.661106 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d101d5_fd79_404a_9c5c_157e42608ae5.slice/crio-conmon-825ec843bd69f3b6e04dd0e35407995c786ba6dd53a3fe81642dfc322247e64d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc4f75b_d84a_4bfc_8ed2_11c70549f0db.slice/crio-b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b.scope\": RecentStats: unable to find data in memory cache]" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.690172 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.691986 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:26:45.191964739 +0000 UTC m=+152.287122935 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.692023 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" event={"ID":"f5ada1bf-ca71-45b9-885f-9bd75ba7d400","Type":"ContainerStarted","Data":"3d75561b8392398d7be8609d54730ffa7a3d324130ea0c6b6d9804ecd838fa15"} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.708997 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.710530 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.723111 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.731015 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"] Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.800729 4822 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-10T06:26:44.321354577Z","Handler":null,"Name":""} Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.801653 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.801723 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.801822 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.801874 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmgc\" (UniqueName: \"kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: E1010 06:26:44.802164 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:26:45.302150028 +0000 UTC m=+152.397308224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tvjnx" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.852889 4822 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.852948 4822 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.880523 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ftmzs" 
podStartSLOduration=11.880499218 podStartE2EDuration="11.880499218s" podCreationTimestamp="2025-10-10 06:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:44.876472281 +0000 UTC m=+151.971630487" watchObservedRunningTime="2025-10-10 06:26:44.880499218 +0000 UTC m=+151.975657414" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.905055 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.905260 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmgc\" (UniqueName: \"kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.905306 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.905338 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc 
kubenswrapper[4822]: I1010 06:26:44.905756 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.905984 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.938221 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 06:26:44 crc kubenswrapper[4822]: I1010 06:26:44.963314 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmgc\" (UniqueName: \"kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc\") pod \"redhat-marketplace-j9mx4\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") " pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.006614 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.015319 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.015360 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.042609 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.051375 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.052368 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.060873 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.101892 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:45 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:45 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:45 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.102183 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.107794 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.107903 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.107992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqz7c\" (UniqueName: \"kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.108412 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tvjnx\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.208212 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.209896 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqz7c\" (UniqueName: \"kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.209997 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.210028 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.211209 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.213760 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " 
pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.232227 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqz7c\" (UniqueName: \"kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c\") pod \"redhat-marketplace-6sw5h\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.311101 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") pod \"2df5f09d-0d1c-40cf-9041-695d831d552d\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.311155 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") pod \"2df5f09d-0d1c-40cf-9041-695d831d552d\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.311233 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvh7d\" (UniqueName: \"kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d\") pod \"2df5f09d-0d1c-40cf-9041-695d831d552d\" (UID: \"2df5f09d-0d1c-40cf-9041-695d831d552d\") " Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.313318 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2df5f09d-0d1c-40cf-9041-695d831d552d" (UID: "2df5f09d-0d1c-40cf-9041-695d831d552d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.316138 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d" (OuterVolumeSpecName: "kube-api-access-mvh7d") pod "2df5f09d-0d1c-40cf-9041-695d831d552d" (UID: "2df5f09d-0d1c-40cf-9041-695d831d552d"). InnerVolumeSpecName "kube-api-access-mvh7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.317578 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2df5f09d-0d1c-40cf-9041-695d831d552d" (UID: "2df5f09d-0d1c-40cf-9041-695d831d552d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.385136 4822 patch_prober.go:28] interesting pod/downloads-7954f5f757-tr8l7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.385203 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tr8l7" podUID="6e88bcbb-f0a7-4837-96e0-6ced47adb39a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.385252 4822 patch_prober.go:28] interesting pod/downloads-7954f5f757-tr8l7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 10 06:26:45 crc 
kubenswrapper[4822]: I1010 06:26:45.385276 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tr8l7" podUID="6e88bcbb-f0a7-4837-96e0-6ced47adb39a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.411769 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.412497 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvh7d\" (UniqueName: \"kubernetes.io/projected/2df5f09d-0d1c-40cf-9041-695d831d552d-kube-api-access-mvh7d\") on node \"crc\" DevicePath \"\"" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.412532 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2df5f09d-0d1c-40cf-9041-695d831d552d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.412545 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2df5f09d-0d1c-40cf-9041-695d831d552d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.527608 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.563003 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.568971 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kznjd" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.572914 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.604141 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.610815 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"] Oct 10 06:26:45 crc kubenswrapper[4822]: W1010 06:26:45.618249 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929f6f10_eafc_40d0_9517_4cdd93f448ba.slice/crio-1a04bc11456fed84f67b8c5c586d2b61b20a88f2d42d4c937ce3e3d08453f910 WatchSource:0}: Error finding container 1a04bc11456fed84f67b8c5c586d2b61b20a88f2d42d4c937ce3e3d08453f910: Status 404 returned error can't find the container with id 1a04bc11456fed84f67b8c5c586d2b61b20a88f2d42d4c937ce3e3d08453f910 Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.624033 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p8fht" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.701630 4822 patch_prober.go:28] interesting pod/console-f9d7485db-kvjlx container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.701707 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kvjlx" podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.728662 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.729674 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.729707 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.729720 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jcm6q"] Oct 10 06:26:45 crc kubenswrapper[4822]: E1010 06:26:45.729994 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df5f09d-0d1c-40cf-9041-695d831d552d" containerName="collect-profiles" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.730009 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df5f09d-0d1c-40cf-9041-695d831d552d" containerName="collect-profiles" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.730169 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df5f09d-0d1c-40cf-9041-695d831d552d" containerName="collect-profiles" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.731885 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-jcm6q"] Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.732006 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.737368 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.741468 4822 generic.go:334] "Generic (PLEG): container finished" podID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerID="b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b" exitCode=0 Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.741568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerDied","Data":"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.744666 4822 generic.go:334] "Generic (PLEG): container finished" podID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerID="b5d6f8bd5fffbd7bdc00a7e93712079358b07196854b9ddd965966b3e916964a" exitCode=0 Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.744732 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerDied","Data":"b5d6f8bd5fffbd7bdc00a7e93712079358b07196854b9ddd965966b3e916964a"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.780716 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerStarted","Data":"1a04bc11456fed84f67b8c5c586d2b61b20a88f2d42d4c937ce3e3d08453f910"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.782829 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" event={"ID":"2df5f09d-0d1c-40cf-9041-695d831d552d","Type":"ContainerDied","Data":"225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.782865 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225d8beb42cfca01c1e0224c37c872598e6b92cd2ef71a1012c568ba82c82f1c" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.782933 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.787029 4822 generic.go:334] "Generic (PLEG): container finished" podID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerID="dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e" exitCode=0 Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.787351 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerDied","Data":"dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.815617 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.820340 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.820544 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.820589 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5r4f\" (UniqueName: \"kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: W1010 06:26:45.820976 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27ef059_d8bc_44a1_8940_bcb6031a72b1.slice/crio-8d223c4b68eaabf9a38886a2037280780e75d0b60999738079bad38058f72b5e WatchSource:0}: Error finding container 8d223c4b68eaabf9a38886a2037280780e75d0b60999738079bad38058f72b5e: Status 404 returned error can't find the container with id 8d223c4b68eaabf9a38886a2037280780e75d0b60999738079bad38058f72b5e Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.825927 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c278601e-ee44-4ecf-9354-6eadee3d6788","Type":"ContainerStarted","Data":"8662a97462adce48db1b4b2393c017a4105ba5603046da7c5ddeb4353558c363"} Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.925519 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 
06:26:45.925570 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5r4f\" (UniqueName: \"kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.925659 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.929293 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.934396 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:45 crc kubenswrapper[4822]: I1010 06:26:45.962301 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5r4f\" (UniqueName: \"kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f\") pod \"redhat-operators-jcm6q\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") " pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.065987 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.073773 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.079419 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.088192 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.108362 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.111739 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.115356 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:46 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:46 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:46 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.115396 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.128463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rf6b\" (UniqueName: 
\"kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.128533 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.128607 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.207746 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.232215 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.232316 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" 
Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.232419 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rf6b\" (UniqueName: \"kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.233396 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.234402 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.248065 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9rx2j" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.272656 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rf6b\" (UniqueName: \"kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b\") pod \"redhat-operators-bvxbr\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.337022 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:26:46 crc 
kubenswrapper[4822]: I1010 06:26:46.500066 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.774135 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:26:46 crc kubenswrapper[4822]: W1010 06:26:46.784683 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda691ce25_89c5_4ed2_85d2_8ce11aa62b81.slice/crio-8002ea36940b79e243643788ad4125cb95c058c51ec8f728d7ca601c8c52a00d WatchSource:0}: Error finding container 8002ea36940b79e243643788ad4125cb95c058c51ec8f728d7ca601c8c52a00d: Status 404 returned error can't find the container with id 8002ea36940b79e243643788ad4125cb95c058c51ec8f728d7ca601c8c52a00d Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.845879 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jcm6q"] Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.857179 4822 generic.go:334] "Generic (PLEG): container finished" podID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerID="6b01067c9e4cc89b921f20f4c2950d062b6e93c886cc55bb0f526df4c4b6b613" exitCode=0 Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.857281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerDied","Data":"6b01067c9e4cc89b921f20f4c2950d062b6e93c886cc55bb0f526df4c4b6b613"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.863900 4822 generic.go:334] "Generic (PLEG): container finished" podID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerID="98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac" exitCode=0 Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.863990 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerDied","Data":"98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.864058 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerStarted","Data":"4513a7086376f9de20cb4009eca21e072f75dad22abfecb131ac6501ef3b3054"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.868659 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" event={"ID":"c27ef059-d8bc-44a1-8940-bcb6031a72b1","Type":"ContainerStarted","Data":"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.868753 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" event={"ID":"c27ef059-d8bc-44a1-8940-bcb6031a72b1","Type":"ContainerStarted","Data":"8d223c4b68eaabf9a38886a2037280780e75d0b60999738079bad38058f72b5e"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.869317 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.871665 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerStarted","Data":"8002ea36940b79e243643788ad4125cb95c058c51ec8f728d7ca601c8c52a00d"} Oct 10 06:26:46 crc kubenswrapper[4822]: W1010 06:26:46.874025 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d359f6_a748_4388_92f2_497f21cca720.slice/crio-c2053e34a710aa03c5bb3b721dc399faf4057ee0be715821b82b8ba4f0632500 WatchSource:0}: Error finding container c2053e34a710aa03c5bb3b721dc399faf4057ee0be715821b82b8ba4f0632500: Status 404 returned error can't find the container with id c2053e34a710aa03c5bb3b721dc399faf4057ee0be715821b82b8ba4f0632500 Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.880263 4822 generic.go:334] "Generic (PLEG): container finished" podID="c278601e-ee44-4ecf-9354-6eadee3d6788" containerID="8662a97462adce48db1b4b2393c017a4105ba5603046da7c5ddeb4353558c363" exitCode=0 Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.880380 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c278601e-ee44-4ecf-9354-6eadee3d6788","Type":"ContainerDied","Data":"8662a97462adce48db1b4b2393c017a4105ba5603046da7c5ddeb4353558c363"} Oct 10 06:26:46 crc kubenswrapper[4822]: I1010 06:26:46.940522 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" podStartSLOduration=132.940506016 podStartE2EDuration="2m12.940506016s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:26:46.938484387 +0000 UTC m=+154.033642603" watchObservedRunningTime="2025-10-10 06:26:46.940506016 +0000 UTC m=+154.035664212" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.103819 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:47 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:47 crc 
kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:47 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.104206 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.223216 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.254156 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access\") pod \"c278601e-ee44-4ecf-9354-6eadee3d6788\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.254239 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir\") pod \"c278601e-ee44-4ecf-9354-6eadee3d6788\" (UID: \"c278601e-ee44-4ecf-9354-6eadee3d6788\") " Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.254521 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c278601e-ee44-4ecf-9354-6eadee3d6788" (UID: "c278601e-ee44-4ecf-9354-6eadee3d6788"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.258032 4822 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c278601e-ee44-4ecf-9354-6eadee3d6788-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.261982 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c278601e-ee44-4ecf-9354-6eadee3d6788" (UID: "c278601e-ee44-4ecf-9354-6eadee3d6788"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.359471 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c278601e-ee44-4ecf-9354-6eadee3d6788-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.913467 4822 generic.go:334] "Generic (PLEG): container finished" podID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerID="6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc" exitCode=0 Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.913523 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerDied","Data":"6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc"} Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.917755 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c278601e-ee44-4ecf-9354-6eadee3d6788","Type":"ContainerDied","Data":"4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da"} Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.917794 
4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e421e7346682d5a44a37bde930f0674c1ed54cf99949c0cb0475c28495091da" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.917899 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.922241 4822 generic.go:334] "Generic (PLEG): container finished" podID="35d359f6-a748-4388-92f2-497f21cca720" containerID="fd887a5df83904b93e58886b65109b85f8175c37973a492f1deef0ac727bdb30" exitCode=0 Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.922914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerDied","Data":"fd887a5df83904b93e58886b65109b85f8175c37973a492f1deef0ac727bdb30"} Oct 10 06:26:47 crc kubenswrapper[4822]: I1010 06:26:47.922976 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerStarted","Data":"c2053e34a710aa03c5bb3b721dc399faf4057ee0be715821b82b8ba4f0632500"} Oct 10 06:26:48 crc kubenswrapper[4822]: I1010 06:26:48.099443 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:48 crc kubenswrapper[4822]: [-]has-synced failed: reason withheld Oct 10 06:26:48 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:48 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:48 crc kubenswrapper[4822]: I1010 06:26:48.099522 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.101912 4822 patch_prober.go:28] interesting pod/router-default-5444994796-dpxkz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:26:49 crc kubenswrapper[4822]: [+]has-synced ok Oct 10 06:26:49 crc kubenswrapper[4822]: [+]process-running ok Oct 10 06:26:49 crc kubenswrapper[4822]: healthz check failed Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.101973 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dpxkz" podUID="dc8a3486-2bb5-49fd-99ed-09a9e743932c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.243233 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:26:49 crc kubenswrapper[4822]: E1010 06:26:49.243529 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c278601e-ee44-4ecf-9354-6eadee3d6788" containerName="pruner" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.243543 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c278601e-ee44-4ecf-9354-6eadee3d6788" containerName="pruner" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.247775 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c278601e-ee44-4ecf-9354-6eadee3d6788" containerName="pruner" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.248378 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.253362 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.253689 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.255323 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.296960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.297140 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.405540 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.405596 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.405620 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.444959 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.585693 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:26:49 crc kubenswrapper[4822]: I1010 06:26:49.939422 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:26:49 crc kubenswrapper[4822]: W1010 06:26:49.954444 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e60dd97_0ab5_454e_a265_cbc07592ad35.slice/crio-ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812 WatchSource:0}: Error finding container ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812: Status 404 returned error can't find the container with id ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812 Oct 10 06:26:50 crc kubenswrapper[4822]: I1010 06:26:50.102820 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:50 crc kubenswrapper[4822]: I1010 06:26:50.108531 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dpxkz" Oct 10 06:26:50 crc kubenswrapper[4822]: I1010 06:26:50.951445 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7e60dd97-0ab5-454e-a265-cbc07592ad35","Type":"ContainerStarted","Data":"ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812"} Oct 10 06:26:51 crc kubenswrapper[4822]: I1010 06:26:51.642414 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-r2qgn" Oct 10 06:26:51 crc kubenswrapper[4822]: I1010 06:26:51.966058 4822 generic.go:334] "Generic (PLEG): container finished" podID="7e60dd97-0ab5-454e-a265-cbc07592ad35" containerID="ca913f96911716c3fb3f3972eadf8a418267b50a1d370c68695cb6900b19b14c" exitCode=0 Oct 10 06:26:51 crc kubenswrapper[4822]: I1010 06:26:51.966137 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7e60dd97-0ab5-454e-a265-cbc07592ad35","Type":"ContainerDied","Data":"ca913f96911716c3fb3f3972eadf8a418267b50a1d370c68695cb6900b19b14c"} Oct 10 06:26:55 crc kubenswrapper[4822]: I1010 06:26:55.389204 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tr8l7" Oct 10 06:26:55 crc kubenswrapper[4822]: I1010 06:26:55.724037 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:55 crc kubenswrapper[4822]: I1010 06:26:55.730177 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:26:56 crc kubenswrapper[4822]: I1010 06:26:56.968700 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:56 crc kubenswrapper[4822]: I1010 06:26:56.974164 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a5c431a-2c94-41ca-aba2-c7a04c4908db-metrics-certs\") pod \"network-metrics-daemon-25l92\" (UID: \"8a5c431a-2c94-41ca-aba2-c7a04c4908db\") " pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:26:57 crc kubenswrapper[4822]: I1010 06:26:57.069143 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-25l92" Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.405253 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.517595 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access\") pod \"7e60dd97-0ab5-454e-a265-cbc07592ad35\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.517994 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir\") pod \"7e60dd97-0ab5-454e-a265-cbc07592ad35\" (UID: \"7e60dd97-0ab5-454e-a265-cbc07592ad35\") " Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.518094 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e60dd97-0ab5-454e-a265-cbc07592ad35" (UID: "7e60dd97-0ab5-454e-a265-cbc07592ad35"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.518792 4822 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e60dd97-0ab5-454e-a265-cbc07592ad35-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.523736 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e60dd97-0ab5-454e-a265-cbc07592ad35" (UID: "7e60dd97-0ab5-454e-a265-cbc07592ad35"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:27:00 crc kubenswrapper[4822]: I1010 06:27:00.620524 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e60dd97-0ab5-454e-a265-cbc07592ad35-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:01 crc kubenswrapper[4822]: I1010 06:27:01.046575 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7e60dd97-0ab5-454e-a265-cbc07592ad35","Type":"ContainerDied","Data":"ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812"} Oct 10 06:27:01 crc kubenswrapper[4822]: I1010 06:27:01.046639 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:27:01 crc kubenswrapper[4822]: I1010 06:27:01.046650 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbd54097424b6dac953dca04c91c41c5b0f2d58fcc3bc3d80ff1ae3e9768812" Oct 10 06:27:01 crc kubenswrapper[4822]: I1010 06:27:01.336969 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:27:01 crc kubenswrapper[4822]: I1010 06:27:01.337047 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:27:05 crc kubenswrapper[4822]: I1010 06:27:05.417088 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:27:14 crc kubenswrapper[4822]: E1010 06:27:14.797456 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 10 06:27:14 crc kubenswrapper[4822]: E1010 06:27:14.798156 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j5r4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{}
,RestartPolicy:nil,} start failed in pod redhat-operators-jcm6q_openshift-marketplace(35d359f6-a748-4388-92f2-497f21cca720): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:27:14 crc kubenswrapper[4822]: E1010 06:27:14.799361 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jcm6q" podUID="35d359f6-a748-4388-92f2-497f21cca720" Oct 10 06:27:15 crc kubenswrapper[4822]: E1010 06:27:15.810166 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jcm6q" podUID="35d359f6-a748-4388-92f2-497f21cca720" Oct 10 06:27:16 crc kubenswrapper[4822]: I1010 06:27:16.167886 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7trts" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.458593 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.458772 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwmgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j9mx4_openshift-marketplace(929f6f10-eafc-40d0-9517-4cdd93f448ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.460257 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j9mx4" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" Oct 10 06:27:17 crc 
kubenswrapper[4822]: E1010 06:27:17.529701 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.530060 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lkbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-fdk5h_openshift-marketplace(14d101d5-fd79-404a-9c5c-157e42608ae5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.531513 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fdk5h" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.555209 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.555343 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whq8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6ndhv_openshift-marketplace(ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:27:17 crc kubenswrapper[4822]: E1010 06:27:17.556964 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6ndhv" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" Oct 10 06:27:17 crc 
kubenswrapper[4822]: I1010 06:27:17.844929 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-25l92"] Oct 10 06:27:17 crc kubenswrapper[4822]: W1010 06:27:17.850521 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a5c431a_2c94_41ca_aba2_c7a04c4908db.slice/crio-b5ca2d8935350b38cf163baa510947e2f344df838f4b5d19cf48322175fd0fb3 WatchSource:0}: Error finding container b5ca2d8935350b38cf163baa510947e2f344df838f4b5d19cf48322175fd0fb3: Status 404 returned error can't find the container with id b5ca2d8935350b38cf163baa510947e2f344df838f4b5d19cf48322175fd0fb3 Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.149210 4822 generic.go:334] "Generic (PLEG): container finished" podID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerID="9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b" exitCode=0 Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.149327 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerDied","Data":"9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b"} Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.152614 4822 generic.go:334] "Generic (PLEG): container finished" podID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerID="250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866" exitCode=0 Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.152694 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerDied","Data":"250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866"} Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.162380 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerID="99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0" exitCode=0 Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.162471 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerDied","Data":"99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0"} Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.172685 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25l92" event={"ID":"8a5c431a-2c94-41ca-aba2-c7a04c4908db","Type":"ContainerStarted","Data":"d856ac454cf7ecc6f6971651aadcddc69ae210d85e5af7f314f1662a33ff5d96"} Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.172727 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25l92" event={"ID":"8a5c431a-2c94-41ca-aba2-c7a04c4908db","Type":"ContainerStarted","Data":"b5ca2d8935350b38cf163baa510947e2f344df838f4b5d19cf48322175fd0fb3"} Oct 10 06:27:18 crc kubenswrapper[4822]: I1010 06:27:18.177730 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerStarted","Data":"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12"} Oct 10 06:27:18 crc kubenswrapper[4822]: E1010 06:27:18.185058 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j9mx4" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" Oct 10 06:27:18 crc kubenswrapper[4822]: E1010 06:27:18.188462 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6ndhv" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" Oct 10 06:27:18 crc kubenswrapper[4822]: E1010 06:27:18.188628 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fdk5h" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.184723 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerStarted","Data":"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144"} Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.188388 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerStarted","Data":"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da"} Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.190931 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerStarted","Data":"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e"} Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.192935 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-25l92" event={"ID":"8a5c431a-2c94-41ca-aba2-c7a04c4908db","Type":"ContainerStarted","Data":"3d1a71ed1b325d834a87142f0c60aeb09a662dd0aeef1e9836cad93b893cdbe5"} Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.194502 4822 generic.go:334] "Generic (PLEG): 
container finished" podID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerID="6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12" exitCode=0 Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.194533 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerDied","Data":"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12"} Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.210373 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r42ph" podStartSLOduration=3.2260759820000002 podStartE2EDuration="36.210346794s" podCreationTimestamp="2025-10-10 06:26:43 +0000 UTC" firstStartedPulling="2025-10-10 06:26:45.813385344 +0000 UTC m=+152.908543540" lastFinishedPulling="2025-10-10 06:27:18.797656116 +0000 UTC m=+185.892814352" observedRunningTime="2025-10-10 06:27:19.206915084 +0000 UTC m=+186.302073290" watchObservedRunningTime="2025-10-10 06:27:19.210346794 +0000 UTC m=+186.305505000" Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.244175 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xc5gb" podStartSLOduration=3.055396083 podStartE2EDuration="37.244156829s" podCreationTimestamp="2025-10-10 06:26:42 +0000 UTC" firstStartedPulling="2025-10-10 06:26:44.607898441 +0000 UTC m=+151.703056637" lastFinishedPulling="2025-10-10 06:27:18.796659187 +0000 UTC m=+185.891817383" observedRunningTime="2025-10-10 06:27:19.242288684 +0000 UTC m=+186.337446900" watchObservedRunningTime="2025-10-10 06:27:19.244156829 +0000 UTC m=+186.339315035" Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.272128 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-25l92" podStartSLOduration=165.272113193 
podStartE2EDuration="2m45.272113193s" podCreationTimestamp="2025-10-10 06:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:27:19.271101293 +0000 UTC m=+186.366259519" watchObservedRunningTime="2025-10-10 06:27:19.272113193 +0000 UTC m=+186.367271389" Oct 10 06:27:19 crc kubenswrapper[4822]: I1010 06:27:19.273026 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6sw5h" podStartSLOduration=2.444514883 podStartE2EDuration="34.273018899s" podCreationTimestamp="2025-10-10 06:26:45 +0000 UTC" firstStartedPulling="2025-10-10 06:26:46.889186092 +0000 UTC m=+153.984344288" lastFinishedPulling="2025-10-10 06:27:18.717690068 +0000 UTC m=+185.812848304" observedRunningTime="2025-10-10 06:27:19.258602729 +0000 UTC m=+186.353760925" watchObservedRunningTime="2025-10-10 06:27:19.273018899 +0000 UTC m=+186.368177085" Oct 10 06:27:20 crc kubenswrapper[4822]: I1010 06:27:20.208137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerStarted","Data":"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1"} Oct 10 06:27:20 crc kubenswrapper[4822]: I1010 06:27:20.234496 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvxbr" podStartSLOduration=2.500929426 podStartE2EDuration="34.234478857s" podCreationTimestamp="2025-10-10 06:26:46 +0000 UTC" firstStartedPulling="2025-10-10 06:26:47.915345824 +0000 UTC m=+155.010504020" lastFinishedPulling="2025-10-10 06:27:19.648895235 +0000 UTC m=+186.744053451" observedRunningTime="2025-10-10 06:27:20.229593855 +0000 UTC m=+187.324752071" watchObservedRunningTime="2025-10-10 06:27:20.234478857 +0000 UTC m=+187.329637053" Oct 10 06:27:22 crc kubenswrapper[4822]: I1010 
06:27:22.008784 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.239569 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.239638 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.440182 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.440237 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.523728 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:23 crc kubenswrapper[4822]: I1010 06:27:23.527635 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:24 crc kubenswrapper[4822]: I1010 06:27:24.271024 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:24 crc kubenswrapper[4822]: I1010 06:27:24.295958 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:25 crc kubenswrapper[4822]: I1010 06:27:25.528291 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:25 crc kubenswrapper[4822]: I1010 06:27:25.530256 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:25 crc kubenswrapper[4822]: I1010 06:27:25.568952 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.280971 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.503085 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.503150 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.543047 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.578824 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.579072 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r42ph" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="registry-server" containerID="cri-o://88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144" gracePeriod=2 Oct 10 06:27:26 crc kubenswrapper[4822]: I1010 06:27:26.902559 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.034436 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities\") pod \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.034591 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content\") pod \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.034639 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxpg8\" (UniqueName: \"kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8\") pod \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\" (UID: \"368e51fd-b3b3-48be-a3ef-dbdf3fb53295\") " Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.035542 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities" (OuterVolumeSpecName: "utilities") pod "368e51fd-b3b3-48be-a3ef-dbdf3fb53295" (UID: "368e51fd-b3b3-48be-a3ef-dbdf3fb53295"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.046922 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8" (OuterVolumeSpecName: "kube-api-access-cxpg8") pod "368e51fd-b3b3-48be-a3ef-dbdf3fb53295" (UID: "368e51fd-b3b3-48be-a3ef-dbdf3fb53295"). InnerVolumeSpecName "kube-api-access-cxpg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.092296 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "368e51fd-b3b3-48be-a3ef-dbdf3fb53295" (UID: "368e51fd-b3b3-48be-a3ef-dbdf3fb53295"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.136248 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.136328 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxpg8\" (UniqueName: \"kubernetes.io/projected/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-kube-api-access-cxpg8\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.136345 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/368e51fd-b3b3-48be-a3ef-dbdf3fb53295-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.250847 4822 generic.go:334] "Generic (PLEG): container finished" podID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerID="88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144" exitCode=0 Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.250904 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r42ph" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.250962 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerDied","Data":"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144"} Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.251015 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r42ph" event={"ID":"368e51fd-b3b3-48be-a3ef-dbdf3fb53295","Type":"ContainerDied","Data":"4479d2486a51781138d36b996597fce95c9d2422981180d9a158576d8ddec3c6"} Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.251059 4822 scope.go:117] "RemoveContainer" containerID="88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.277175 4822 scope.go:117] "RemoveContainer" containerID="9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.283461 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.287362 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r42ph"] Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.300935 4822 scope.go:117] "RemoveContainer" containerID="dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.316772 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.330790 4822 scope.go:117] "RemoveContainer" containerID="88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144" Oct 10 06:27:27 crc 
kubenswrapper[4822]: E1010 06:27:27.331512 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144\": container with ID starting with 88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144 not found: ID does not exist" containerID="88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.331567 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144"} err="failed to get container status \"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144\": rpc error: code = NotFound desc = could not find container \"88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144\": container with ID starting with 88feba6ff19969e9196f3012a3a4462ccc2862a8e5578aeae7b8f650ccfe1144 not found: ID does not exist" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.331626 4822 scope.go:117] "RemoveContainer" containerID="9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b" Oct 10 06:27:27 crc kubenswrapper[4822]: E1010 06:27:27.332084 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b\": container with ID starting with 9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b not found: ID does not exist" containerID="9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.332105 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b"} err="failed to get container status 
\"9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b\": rpc error: code = NotFound desc = could not find container \"9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b\": container with ID starting with 9c41289ca3dc24241180d56dd8e59f9beb73514fb8c98b8c5578e9fcd5d1d88b not found: ID does not exist" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.332126 4822 scope.go:117] "RemoveContainer" containerID="dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e" Oct 10 06:27:27 crc kubenswrapper[4822]: E1010 06:27:27.332489 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e\": container with ID starting with dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e not found: ID does not exist" containerID="dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.332549 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e"} err="failed to get container status \"dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e\": rpc error: code = NotFound desc = could not find container \"dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e\": container with ID starting with dabc5e2a7dc7411039fe64c80abc66a4e3bb16d9f63e2b562d96713286657d6e not found: ID does not exist" Oct 10 06:27:27 crc kubenswrapper[4822]: I1010 06:27:27.657756 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" path="/var/lib/kubelet/pods/368e51fd-b3b3-48be-a3ef-dbdf3fb53295/volumes" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.381056 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 
06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.381333 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xc5gb" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="registry-server" containerID="cri-o://66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da" gracePeriod=2 Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.737388 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.857794 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn97s\" (UniqueName: \"kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s\") pod \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.857890 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities\") pod \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.857918 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content\") pod \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\" (UID: \"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db\") " Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.859241 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities" (OuterVolumeSpecName: "utilities") pod "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" (UID: "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.869185 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s" (OuterVolumeSpecName: "kube-api-access-sn97s") pod "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" (UID: "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db"). InnerVolumeSpecName "kube-api-access-sn97s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.921996 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" (UID: "3bc4f75b-d84a-4bfc-8ed2-11c70549f0db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.959388 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn97s\" (UniqueName: \"kubernetes.io/projected/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-kube-api-access-sn97s\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.959442 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.959460 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:28 crc kubenswrapper[4822]: I1010 06:27:28.984110 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:27:29 crc 
kubenswrapper[4822]: I1010 06:27:29.266611 4822 generic.go:334] "Generic (PLEG): container finished" podID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerID="66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da" exitCode=0 Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.266696 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xc5gb" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.266710 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerDied","Data":"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da"} Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.267192 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xc5gb" event={"ID":"3bc4f75b-d84a-4bfc-8ed2-11c70549f0db","Type":"ContainerDied","Data":"8a8fe185a0ddd65fe01ee8f914ef170b552588c023d0ea7e6d6a132965c65c81"} Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.267214 4822 scope.go:117] "RemoveContainer" containerID="66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.267348 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6sw5h" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="registry-server" containerID="cri-o://526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e" gracePeriod=2 Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.291498 4822 scope.go:117] "RemoveContainer" containerID="250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.305254 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 06:27:29 crc 
kubenswrapper[4822]: I1010 06:27:29.315246 4822 scope.go:117] "RemoveContainer" containerID="b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.316448 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xc5gb"] Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.382956 4822 scope.go:117] "RemoveContainer" containerID="66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da" Oct 10 06:27:29 crc kubenswrapper[4822]: E1010 06:27:29.383672 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da\": container with ID starting with 66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da not found: ID does not exist" containerID="66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.383754 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da"} err="failed to get container status \"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da\": rpc error: code = NotFound desc = could not find container \"66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da\": container with ID starting with 66f2863c497035b53d9703d2b0e668a69fdf90df123f9ce01d22777c25a524da not found: ID does not exist" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.383896 4822 scope.go:117] "RemoveContainer" containerID="250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866" Oct 10 06:27:29 crc kubenswrapper[4822]: E1010 06:27:29.384349 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866\": container with ID starting with 250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866 not found: ID does not exist" containerID="250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.384397 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866"} err="failed to get container status \"250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866\": rpc error: code = NotFound desc = could not find container \"250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866\": container with ID starting with 250840f4c305d4418df44e78f7d90afa235b733e697966d74fd597427d668866 not found: ID does not exist" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.384424 4822 scope.go:117] "RemoveContainer" containerID="b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b" Oct 10 06:27:29 crc kubenswrapper[4822]: E1010 06:27:29.384730 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b\": container with ID starting with b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b not found: ID does not exist" containerID="b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.384872 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b"} err="failed to get container status \"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b\": rpc error: code = NotFound desc = could not find container \"b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b\": container with ID 
starting with b1090512b2e8823e5b5d2f45f5a6e843c055a19297b5161d02e1f1ddb66cfd8b not found: ID does not exist" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.567981 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.659603 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" path="/var/lib/kubelet/pods/3bc4f75b-d84a-4bfc-8ed2-11c70549f0db/volumes" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.679744 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content\") pod \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.679845 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqz7c\" (UniqueName: \"kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c\") pod \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.679875 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities\") pod \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\" (UID: \"f345d3f4-b350-4740-8aa3-c1b2d4dac32a\") " Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.683093 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities" (OuterVolumeSpecName: "utilities") pod "f345d3f4-b350-4740-8aa3-c1b2d4dac32a" (UID: "f345d3f4-b350-4740-8aa3-c1b2d4dac32a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.683645 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.691639 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c" (OuterVolumeSpecName: "kube-api-access-pqz7c") pod "f345d3f4-b350-4740-8aa3-c1b2d4dac32a" (UID: "f345d3f4-b350-4740-8aa3-c1b2d4dac32a"). InnerVolumeSpecName "kube-api-access-pqz7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.696738 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f345d3f4-b350-4740-8aa3-c1b2d4dac32a" (UID: "f345d3f4-b350-4740-8aa3-c1b2d4dac32a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.785189 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqz7c\" (UniqueName: \"kubernetes.io/projected/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-kube-api-access-pqz7c\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:29 crc kubenswrapper[4822]: I1010 06:27:29.785266 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f345d3f4-b350-4740-8aa3-c1b2d4dac32a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.277246 4822 generic.go:334] "Generic (PLEG): container finished" podID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerID="526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e" exitCode=0 Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.277313 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sw5h" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.277344 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerDied","Data":"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e"} Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.277697 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sw5h" event={"ID":"f345d3f4-b350-4740-8aa3-c1b2d4dac32a","Type":"ContainerDied","Data":"4513a7086376f9de20cb4009eca21e072f75dad22abfecb131ac6501ef3b3054"} Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.277721 4822 scope.go:117] "RemoveContainer" containerID="526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.294214 4822 scope.go:117] "RemoveContainer" 
containerID="99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.309597 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.312136 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sw5h"] Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.326743 4822 scope.go:117] "RemoveContainer" containerID="98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.343408 4822 scope.go:117] "RemoveContainer" containerID="526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e" Oct 10 06:27:30 crc kubenswrapper[4822]: E1010 06:27:30.343823 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e\": container with ID starting with 526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e not found: ID does not exist" containerID="526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.343889 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e"} err="failed to get container status \"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e\": rpc error: code = NotFound desc = could not find container \"526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e\": container with ID starting with 526b388caefd9e44ac95e5abe4cdde4a1dfb57d6777090fb6bf26db216d9555e not found: ID does not exist" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.343927 4822 scope.go:117] "RemoveContainer" 
containerID="99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0" Oct 10 06:27:30 crc kubenswrapper[4822]: E1010 06:27:30.344260 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0\": container with ID starting with 99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0 not found: ID does not exist" containerID="99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.344298 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0"} err="failed to get container status \"99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0\": rpc error: code = NotFound desc = could not find container \"99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0\": container with ID starting with 99c307c7246470d9361300fd8d86df1420864a444144a9f97dda9993261e49c0 not found: ID does not exist" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.344346 4822 scope.go:117] "RemoveContainer" containerID="98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac" Oct 10 06:27:30 crc kubenswrapper[4822]: E1010 06:27:30.344838 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac\": container with ID starting with 98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac not found: ID does not exist" containerID="98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.344924 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac"} err="failed to get container status \"98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac\": rpc error: code = NotFound desc = could not find container \"98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac\": container with ID starting with 98b17eee90f0887cd7ad841a69ce9fe46d5ff55d20c650d17b1a582a3be433ac not found: ID does not exist" Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.779152 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:27:30 crc kubenswrapper[4822]: I1010 06:27:30.779422 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvxbr" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="registry-server" containerID="cri-o://6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1" gracePeriod=2 Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.126989 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.202373 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content\") pod \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.202456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rf6b\" (UniqueName: \"kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b\") pod \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.202475 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities\") pod \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\" (UID: \"a691ce25-89c5-4ed2-85d2-8ce11aa62b81\") " Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.203969 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities" (OuterVolumeSpecName: "utilities") pod "a691ce25-89c5-4ed2-85d2-8ce11aa62b81" (UID: "a691ce25-89c5-4ed2-85d2-8ce11aa62b81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.214062 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b" (OuterVolumeSpecName: "kube-api-access-8rf6b") pod "a691ce25-89c5-4ed2-85d2-8ce11aa62b81" (UID: "a691ce25-89c5-4ed2-85d2-8ce11aa62b81"). InnerVolumeSpecName "kube-api-access-8rf6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.286793 4822 generic.go:334] "Generic (PLEG): container finished" podID="35d359f6-a748-4388-92f2-497f21cca720" containerID="8dc97415c655b1481305faf39d5fac27194d36ef7dc2f2e18282c2a5a42c5614" exitCode=0 Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.286831 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerDied","Data":"8dc97415c655b1481305faf39d5fac27194d36ef7dc2f2e18282c2a5a42c5614"} Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.293018 4822 generic.go:334] "Generic (PLEG): container finished" podID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerID="6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1" exitCode=0 Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.293068 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerDied","Data":"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1"} Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.293089 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvxbr" event={"ID":"a691ce25-89c5-4ed2-85d2-8ce11aa62b81","Type":"ContainerDied","Data":"8002ea36940b79e243643788ad4125cb95c058c51ec8f728d7ca601c8c52a00d"} Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.293106 4822 scope.go:117] "RemoveContainer" containerID="6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.293173 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvxbr" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.295037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a691ce25-89c5-4ed2-85d2-8ce11aa62b81" (UID: "a691ce25-89c5-4ed2-85d2-8ce11aa62b81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.304423 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rf6b\" (UniqueName: \"kubernetes.io/projected/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-kube-api-access-8rf6b\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.304466 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.304483 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a691ce25-89c5-4ed2-85d2-8ce11aa62b81-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.310756 4822 scope.go:117] "RemoveContainer" containerID="6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.339770 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.339858 4822 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.341314 4822 scope.go:117] "RemoveContainer" containerID="6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.361718 4822 scope.go:117] "RemoveContainer" containerID="6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1" Oct 10 06:27:31 crc kubenswrapper[4822]: E1010 06:27:31.362122 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1\": container with ID starting with 6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1 not found: ID does not exist" containerID="6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.362156 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1"} err="failed to get container status \"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1\": rpc error: code = NotFound desc = could not find container \"6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1\": container with ID starting with 6d367857bbd28dea554ade9647e2c093ecafbdeb8426d9f93e1c2ecb0cfbf7f1 not found: ID does not exist" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.362185 4822 scope.go:117] "RemoveContainer" containerID="6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12" Oct 10 06:27:31 crc kubenswrapper[4822]: E1010 06:27:31.362443 4822 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12\": container with ID starting with 6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12 not found: ID does not exist" containerID="6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.362472 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12"} err="failed to get container status \"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12\": rpc error: code = NotFound desc = could not find container \"6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12\": container with ID starting with 6a47ba63bbbf740c38194bccfe07b1f7f1a08aef75223e48465b86cd880c0a12 not found: ID does not exist" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.362491 4822 scope.go:117] "RemoveContainer" containerID="6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc" Oct 10 06:27:31 crc kubenswrapper[4822]: E1010 06:27:31.362903 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc\": container with ID starting with 6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc not found: ID does not exist" containerID="6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.362929 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc"} err="failed to get container status \"6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc\": rpc error: code = NotFound desc = could not find container 
\"6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc\": container with ID starting with 6e80c07b496874962793a64ea8343c6f6f391e221db15219610564b3c33e53cc not found: ID does not exist" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.618003 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.622373 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvxbr"] Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.660038 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" path="/var/lib/kubelet/pods/a691ce25-89c5-4ed2-85d2-8ce11aa62b81/volumes" Oct 10 06:27:31 crc kubenswrapper[4822]: I1010 06:27:31.661396 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" path="/var/lib/kubelet/pods/f345d3f4-b350-4740-8aa3-c1b2d4dac32a/volumes" Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.304672 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerStarted","Data":"33a507a0e13ce77fe120279090db238ed8801316cd65b4a34718b11ad7ae6986"} Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.306958 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerStarted","Data":"1c64c0d1996880b66145e5297dca4266df4ec140ce59e6511d3699cd447a37d6"} Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.309346 4822 generic.go:334] "Generic (PLEG): container finished" podID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerID="2b147d90ee0eb587f684606327f53e2344806f07030864e5b21923066653e9e1" exitCode=0 Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 
06:27:32.309430 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerDied","Data":"2b147d90ee0eb587f684606327f53e2344806f07030864e5b21923066653e9e1"} Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.314927 4822 generic.go:334] "Generic (PLEG): container finished" podID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerID="becece30a8d5758c96991bf8469d9db283b6795184df0df4248fdb7a0d8033b7" exitCode=0 Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.314965 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerDied","Data":"becece30a8d5758c96991bf8469d9db283b6795184df0df4248fdb7a0d8033b7"} Oct 10 06:27:32 crc kubenswrapper[4822]: I1010 06:27:32.326933 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jcm6q" podStartSLOduration=3.51685369 podStartE2EDuration="47.326912866s" podCreationTimestamp="2025-10-10 06:26:45 +0000 UTC" firstStartedPulling="2025-10-10 06:26:47.92450331 +0000 UTC m=+155.019661506" lastFinishedPulling="2025-10-10 06:27:31.734562486 +0000 UTC m=+198.829720682" observedRunningTime="2025-10-10 06:27:32.323660756 +0000 UTC m=+199.418818962" watchObservedRunningTime="2025-10-10 06:27:32.326912866 +0000 UTC m=+199.422071062" Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.321714 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerStarted","Data":"900eaec8b37e97d4ffb775bbe75695003980af3972fc02cf8e705b5f79acfe47"} Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.323731 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" 
event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerStarted","Data":"b98de16f7b117fbb72dbe83ca81b7eeb1c86d79691728e95a1038790e9db21e7"} Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.326648 4822 generic.go:334] "Generic (PLEG): container finished" podID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerID="1c64c0d1996880b66145e5297dca4266df4ec140ce59e6511d3699cd447a37d6" exitCode=0 Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.326686 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerDied","Data":"1c64c0d1996880b66145e5297dca4266df4ec140ce59e6511d3699cd447a37d6"} Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.338361 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6ndhv" podStartSLOduration=3.215270714 podStartE2EDuration="51.33834216s" podCreationTimestamp="2025-10-10 06:26:42 +0000 UTC" firstStartedPulling="2025-10-10 06:26:44.610661432 +0000 UTC m=+151.705819628" lastFinishedPulling="2025-10-10 06:27:32.733732878 +0000 UTC m=+199.828891074" observedRunningTime="2025-10-10 06:27:33.338141894 +0000 UTC m=+200.433300090" watchObservedRunningTime="2025-10-10 06:27:33.33834216 +0000 UTC m=+200.433500356" Oct 10 06:27:33 crc kubenswrapper[4822]: I1010 06:27:33.375648 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9mx4" podStartSLOduration=3.5819035489999997 podStartE2EDuration="49.375627386s" podCreationTimestamp="2025-10-10 06:26:44 +0000 UTC" firstStartedPulling="2025-10-10 06:26:46.897256037 +0000 UTC m=+153.992414233" lastFinishedPulling="2025-10-10 06:27:32.690979874 +0000 UTC m=+199.786138070" observedRunningTime="2025-10-10 06:27:33.374576734 +0000 UTC m=+200.469734950" watchObservedRunningTime="2025-10-10 06:27:33.375627386 +0000 UTC m=+200.470785582" 
Oct 10 06:27:35 crc kubenswrapper[4822]: I1010 06:27:35.043362 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:27:35 crc kubenswrapper[4822]: I1010 06:27:35.043713 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:27:35 crc kubenswrapper[4822]: I1010 06:27:35.096573 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:27:36 crc kubenswrapper[4822]: I1010 06:27:36.110667 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:27:36 crc kubenswrapper[4822]: I1010 06:27:36.112453 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:27:37 crc kubenswrapper[4822]: I1010 06:27:37.158894 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jcm6q" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="registry-server" probeResult="failure" output=< Oct 10 06:27:37 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 06:27:37 crc kubenswrapper[4822]: > Oct 10 06:27:38 crc kubenswrapper[4822]: I1010 06:27:38.350415 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerStarted","Data":"76a1e5e483a6ea1d2237086cd980fcd80162acd3258b6a82fbbe3bbebe5621d9"} Oct 10 06:27:42 crc kubenswrapper[4822]: I1010 06:27:42.858265 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:27:42 crc kubenswrapper[4822]: I1010 06:27:42.860921 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:27:42 crc kubenswrapper[4822]: I1010 06:27:42.902238 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:27:42 crc kubenswrapper[4822]: I1010 06:27:42.937381 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fdk5h" podStartSLOduration=8.327481548 podStartE2EDuration="1m0.937359588s" podCreationTimestamp="2025-10-10 06:26:42 +0000 UTC" firstStartedPulling="2025-10-10 06:26:44.635466414 +0000 UTC m=+151.730624610" lastFinishedPulling="2025-10-10 06:27:37.245344434 +0000 UTC m=+204.340502650" observedRunningTime="2025-10-10 06:27:38.370480327 +0000 UTC m=+205.465638543" watchObservedRunningTime="2025-10-10 06:27:42.937359588 +0000 UTC m=+210.032517784" Oct 10 06:27:43 crc kubenswrapper[4822]: I1010 06:27:43.099087 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:27:43 crc kubenswrapper[4822]: I1010 06:27:43.099354 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:27:43 crc kubenswrapper[4822]: I1010 06:27:43.146933 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:27:43 crc kubenswrapper[4822]: I1010 06:27:43.412845 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:27:43 crc kubenswrapper[4822]: I1010 06:27:43.415747 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:27:45 crc kubenswrapper[4822]: I1010 06:27:45.113829 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:27:46 crc kubenswrapper[4822]: I1010 06:27:46.158776 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:27:46 crc kubenswrapper[4822]: I1010 06:27:46.202088 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:27:55 crc kubenswrapper[4822]: I1010 06:27:55.564604 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"] Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.336910 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.337284 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.337336 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.337918 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.337973 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf" gracePeriod=600 Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.497179 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf" exitCode=0 Oct 10 06:28:01 crc kubenswrapper[4822]: I1010 06:28:01.497350 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf"} Oct 10 06:28:02 crc kubenswrapper[4822]: I1010 06:28:02.505202 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14"} Oct 10 06:28:20 crc kubenswrapper[4822]: I1010 06:28:20.593633 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" containerID="cri-o://af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402" gracePeriod=15 Oct 10 06:28:20 crc kubenswrapper[4822]: I1010 06:28:20.967367 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003188 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"] Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003483 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003521 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003536 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003545 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003556 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003564 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003599 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003609 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003622 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003629 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003638 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003646 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003678 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003687 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003695 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003702 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003711 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003718 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003729 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7e60dd97-0ab5-454e-a265-cbc07592ad35" containerName="pruner" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003735 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e60dd97-0ab5-454e-a265-cbc07592ad35" containerName="pruner" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003768 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003776 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="extract-utilities" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003788 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003795 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003828 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003835 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.003847 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003852 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="extract-content" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003966 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a691ce25-89c5-4ed2-85d2-8ce11aa62b81" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003976 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="368e51fd-b3b3-48be-a3ef-dbdf3fb53295" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003984 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerName="oauth-openshift" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.003992 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc4f75b-d84a-4bfc-8ed2-11c70549f0db" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.004000 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e60dd97-0ab5-454e-a265-cbc07592ad35" containerName="pruner" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.004006 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f345d3f4-b350-4740-8aa3-c1b2d4dac32a" containerName="registry-server" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.004439 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.018413 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"] Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038254 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038324 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038355 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038418 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: 
\"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038573 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4kk\" (UniqueName: \"kubernetes.io/projected/efa42a5e-dcc0-4467-9ff8-c57f85187537-kube-api-access-ll4kk\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038611 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.038916 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039042 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc 
kubenswrapper[4822]: I1010 06:28:21.039084 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039141 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039304 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039336 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.039360 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.139968 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140088 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140149 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140178 
4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140213 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxjm\" (UniqueName: \"kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140233 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140250 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140269 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140307 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140351 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140391 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140419 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140439 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login\") pod \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\" (UID: \"a3f9e4cd-d614-4a34-9c5f-c097103e65fc\") " Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140574 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140602 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140630 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140665 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " 
pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140694 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140720 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140739 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140757 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140776 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140798 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140838 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140868 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.140886 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4kk\" (UniqueName: \"kubernetes.io/projected/efa42a5e-dcc0-4467-9ff8-c57f85187537-kube-api-access-ll4kk\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc 
kubenswrapper[4822]: I1010 06:28:21.140907 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.141014 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.141507 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.141636 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.143259 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.143326 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-dir\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.143791 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.144010 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.144470 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-audit-policies\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.144588 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.145662 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.146904 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.146941 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-login\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.146974 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.148018 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-session\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.148481 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149467 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149499 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149558 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149784 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149919 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm" (OuterVolumeSpecName: "kube-api-access-vpxjm") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "kube-api-access-vpxjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.149964 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.150163 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.150410 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.150519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.152150 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efa42a5e-dcc0-4467-9ff8-c57f85187537-v4-0-config-user-template-error\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.153105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.154981 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a3f9e4cd-d614-4a34-9c5f-c097103e65fc" (UID: "a3f9e4cd-d614-4a34-9c5f-c097103e65fc"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.158204 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4kk\" (UniqueName: \"kubernetes.io/projected/efa42a5e-dcc0-4467-9ff8-c57f85187537-kube-api-access-ll4kk\") pod \"oauth-openshift-5dcd86cbbd-lrvd2\" (UID: \"efa42a5e-dcc0-4467-9ff8-c57f85187537\") " pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241227 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241270 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241282 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241295 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241306 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxjm\" (UniqueName: \"kubernetes.io/projected/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-kube-api-access-vpxjm\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241317 4822 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241358 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241380 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241392 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241402 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241413 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241424 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241432 4822 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-audit-dir\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.241441 4822 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3f9e4cd-d614-4a34-9c5f-c097103e65fc-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.328046 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.614651 4822 generic.go:334] "Generic (PLEG): container finished" podID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" containerID="af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402" exitCode=0
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.614707 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" event={"ID":"a3f9e4cd-d614-4a34-9c5f-c097103e65fc","Type":"ContainerDied","Data":"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"}
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.614733 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.614765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pv22z" event={"ID":"a3f9e4cd-d614-4a34-9c5f-c097103e65fc","Type":"ContainerDied","Data":"1da5f62b7473da8e63bea5ce42df4bfee11642f53fb0c76379c0e69de6d5e807"}
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.614790 4822 scope.go:117] "RemoveContainer" containerID="af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.632160 4822 scope.go:117] "RemoveContainer" containerID="af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"
Oct 10 06:28:21 crc kubenswrapper[4822]: E1010 06:28:21.632798 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402\": container with ID starting with af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402 not found: ID does not exist" containerID="af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.632904 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402"} err="failed to get container status \"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402\": rpc error: code = NotFound desc = could not find container \"af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402\": container with ID starting with af2c02cc1a2fa72e99857db1572458c19c971f93ec1c7064e292b516ca117402 not found: ID does not exist"
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.646622 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"]
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.657217 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pv22z"]
Oct 10 06:28:21 crc kubenswrapper[4822]: I1010 06:28:21.759183 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"]
Oct 10 06:28:22 crc kubenswrapper[4822]: I1010 06:28:22.627618 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" event={"ID":"efa42a5e-dcc0-4467-9ff8-c57f85187537","Type":"ContainerStarted","Data":"8c60ba5e316f0c0343c629a2aa487d1fb4cfa115ed3ca259942c22ebde2aa137"}
Oct 10 06:28:22 crc kubenswrapper[4822]: I1010 06:28:22.628046 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:22 crc kubenswrapper[4822]: I1010 06:28:22.628059 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" event={"ID":"efa42a5e-dcc0-4467-9ff8-c57f85187537","Type":"ContainerStarted","Data":"12143986f94af8a2bd52228eb8f420750ca22a98f4ece33f977d75d3c60e6c0d"}
Oct 10 06:28:22 crc kubenswrapper[4822]: I1010 06:28:22.633593 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2"
Oct 10 06:28:22 crc kubenswrapper[4822]: I1010 06:28:22.658073 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5dcd86cbbd-lrvd2" podStartSLOduration=27.658043451 podStartE2EDuration="27.658043451s" podCreationTimestamp="2025-10-10 06:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:28:22.648724897 +0000 UTC m=+249.743883113" watchObservedRunningTime="2025-10-10 06:28:22.658043451 +0000 UTC m=+249.753201657"
Oct 10 06:28:23 crc kubenswrapper[4822]: I1010 06:28:23.667559 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f9e4cd-d614-4a34-9c5f-c097103e65fc" path="/var/lib/kubelet/pods/a3f9e4cd-d614-4a34-9c5f-c097103e65fc/volumes"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.463981 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.465219 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6ndhv" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="registry-server" containerID="cri-o://900eaec8b37e97d4ffb775bbe75695003980af3972fc02cf8e705b5f79acfe47" gracePeriod=30
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.468245 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdk5h"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.468495 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fdk5h" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="registry-server" containerID="cri-o://76a1e5e483a6ea1d2237086cd980fcd80162acd3258b6a82fbbe3bbebe5621d9" gracePeriod=30
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.475118 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.475332 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" containerID="cri-o://f05936b518bd89d54b5b85413c66e23cffeac395745311b2c6bde21d5ee7a831" gracePeriod=30
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.488453 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.488775 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j9mx4" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="registry-server" containerID="cri-o://b98de16f7b117fbb72dbe83ca81b7eeb1c86d79691728e95a1038790e9db21e7" gracePeriod=30
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.496660 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jcm6q"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.497045 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jcm6q" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="registry-server" containerID="cri-o://33a507a0e13ce77fe120279090db238ed8801316cd65b4a34718b11ad7ae6986" gracePeriod=30
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.499457 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7zj69"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.500157 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.513744 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7zj69"]
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.690909 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.690959 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffwpg\" (UniqueName: \"kubernetes.io/projected/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-kube-api-access-ffwpg\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.691009 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.699580 4822 generic.go:334] "Generic (PLEG): container finished" podID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerID="900eaec8b37e97d4ffb775bbe75695003980af3972fc02cf8e705b5f79acfe47" exitCode=0
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.699661 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerDied","Data":"900eaec8b37e97d4ffb775bbe75695003980af3972fc02cf8e705b5f79acfe47"}
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.704274 4822 generic.go:334] "Generic (PLEG): container finished" podID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerID="b98de16f7b117fbb72dbe83ca81b7eeb1c86d79691728e95a1038790e9db21e7" exitCode=0
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.704350 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerDied","Data":"b98de16f7b117fbb72dbe83ca81b7eeb1c86d79691728e95a1038790e9db21e7"}
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.707651 4822 generic.go:334] "Generic (PLEG): container finished" podID="a176335f-a8bb-476a-bc6d-540be238a200" containerID="f05936b518bd89d54b5b85413c66e23cffeac395745311b2c6bde21d5ee7a831" exitCode=0
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.707700 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" event={"ID":"a176335f-a8bb-476a-bc6d-540be238a200","Type":"ContainerDied","Data":"f05936b518bd89d54b5b85413c66e23cffeac395745311b2c6bde21d5ee7a831"}
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.710317 4822 generic.go:334] "Generic (PLEG): container finished" podID="35d359f6-a748-4388-92f2-497f21cca720" containerID="33a507a0e13ce77fe120279090db238ed8801316cd65b4a34718b11ad7ae6986" exitCode=0
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.710358 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerDied","Data":"33a507a0e13ce77fe120279090db238ed8801316cd65b4a34718b11ad7ae6986"}
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.712932 4822 generic.go:334] "Generic (PLEG): container finished" podID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerID="76a1e5e483a6ea1d2237086cd980fcd80162acd3258b6a82fbbe3bbebe5621d9" exitCode=0
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.712969 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerDied","Data":"76a1e5e483a6ea1d2237086cd980fcd80162acd3258b6a82fbbe3bbebe5621d9"}
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.792918 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.793234 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.793256 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffwpg\" (UniqueName: \"kubernetes.io/projected/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-kube-api-access-ffwpg\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.794712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.803777 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.814209 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffwpg\" (UniqueName: \"kubernetes.io/projected/8a112c7a-6133-483e-b34c-f12bfcd7a4ac-kube-api-access-ffwpg\") pod \"marketplace-operator-79b997595-7zj69\" (UID: \"8a112c7a-6133-483e-b34c-f12bfcd7a4ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.823304 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.844549 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.884701 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdk5h"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.889648 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6ndhv"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.895141 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcm6q"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.905088 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9mx4"
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996604 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics\") pod \"a176335f-a8bb-476a-bc6d-540be238a200\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996656 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca\") pod \"a176335f-a8bb-476a-bc6d-540be238a200\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996705 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities\") pod \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996752 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content\") pod \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996785 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content\") pod \"14d101d5-fd79-404a-9c5c-157e42608ae5\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996838 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities\") pod \"35d359f6-a748-4388-92f2-497f21cca720\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996865 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5r4f\" (UniqueName: \"kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f\") pod \"35d359f6-a748-4388-92f2-497f21cca720\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996888 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzqrz\" (UniqueName: \"kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz\") pod \"a176335f-a8bb-476a-bc6d-540be238a200\" (UID: \"a176335f-a8bb-476a-bc6d-540be238a200\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996907 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content\") pod \"35d359f6-a748-4388-92f2-497f21cca720\" (UID: \"35d359f6-a748-4388-92f2-497f21cca720\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996950 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities\") pod \"14d101d5-fd79-404a-9c5c-157e42608ae5\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.996984 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whq8d\" (UniqueName: \"kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d\") pod \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\" (UID: \"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa\") "
Oct 10 06:28:33 crc kubenswrapper[4822]: I1010 06:28:33.997015 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lkbd\" (UniqueName: \"kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd\") pod \"14d101d5-fd79-404a-9c5c-157e42608ae5\" (UID: \"14d101d5-fd79-404a-9c5c-157e42608ae5\") "
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:33.998013 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities" (OuterVolumeSpecName: "utilities") pod "35d359f6-a748-4388-92f2-497f21cca720" (UID: "35d359f6-a748-4388-92f2-497f21cca720"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:33.998973 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities" (OuterVolumeSpecName: "utilities") pod "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" (UID: "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:33.999581 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a176335f-a8bb-476a-bc6d-540be238a200" (UID: "a176335f-a8bb-476a-bc6d-540be238a200"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:33.999807 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities" (OuterVolumeSpecName: "utilities") pod "14d101d5-fd79-404a-9c5c-157e42608ae5" (UID: "14d101d5-fd79-404a-9c5c-157e42608ae5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.014112 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d" (OuterVolumeSpecName: "kube-api-access-whq8d") pod "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" (UID: "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa"). InnerVolumeSpecName "kube-api-access-whq8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.014107 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd" (OuterVolumeSpecName: "kube-api-access-7lkbd") pod "14d101d5-fd79-404a-9c5c-157e42608ae5" (UID: "14d101d5-fd79-404a-9c5c-157e42608ae5"). InnerVolumeSpecName "kube-api-access-7lkbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.014896 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz" (OuterVolumeSpecName: "kube-api-access-dzqrz") pod "a176335f-a8bb-476a-bc6d-540be238a200" (UID: "a176335f-a8bb-476a-bc6d-540be238a200"). InnerVolumeSpecName "kube-api-access-dzqrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.018264 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a176335f-a8bb-476a-bc6d-540be238a200" (UID: "a176335f-a8bb-476a-bc6d-540be238a200"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.020289 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f" (OuterVolumeSpecName: "kube-api-access-j5r4f") pod "35d359f6-a748-4388-92f2-497f21cca720" (UID: "35d359f6-a748-4388-92f2-497f21cca720"). InnerVolumeSpecName "kube-api-access-j5r4f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.047287 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" (UID: "ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.063288 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14d101d5-fd79-404a-9c5c-157e42608ae5" (UID: "14d101d5-fd79-404a-9c5c-157e42608ae5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.098335 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities\") pod \"929f6f10-eafc-40d0-9517-4cdd93f448ba\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") "
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.098746 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content\") pod \"929f6f10-eafc-40d0-9517-4cdd93f448ba\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") "
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.098908 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwmgc\" (UniqueName: \"kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc\") pod \"929f6f10-eafc-40d0-9517-4cdd93f448ba\" (UID: \"929f6f10-eafc-40d0-9517-4cdd93f448ba\") "
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099103 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099121 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5r4f\" (UniqueName: \"kubernetes.io/projected/35d359f6-a748-4388-92f2-497f21cca720-kube-api-access-j5r4f\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099134 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzqrz\" (UniqueName: \"kubernetes.io/projected/a176335f-a8bb-476a-bc6d-540be238a200-kube-api-access-dzqrz\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099143 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099154 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whq8d\" (UniqueName: \"kubernetes.io/projected/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-kube-api-access-whq8d\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099163 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lkbd\" (UniqueName: \"kubernetes.io/projected/14d101d5-fd79-404a-9c5c-157e42608ae5-kube-api-access-7lkbd\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099175 4822 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099184 4822 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a176335f-a8bb-476a-bc6d-540be238a200-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099193 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099200 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099208 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d101d5-fd79-404a-9c5c-157e42608ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.099363 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities" (OuterVolumeSpecName: "utilities") pod "929f6f10-eafc-40d0-9517-4cdd93f448ba" (UID: "929f6f10-eafc-40d0-9517-4cdd93f448ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.102179 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc" (OuterVolumeSpecName: "kube-api-access-vwmgc") pod "929f6f10-eafc-40d0-9517-4cdd93f448ba" (UID: "929f6f10-eafc-40d0-9517-4cdd93f448ba"). InnerVolumeSpecName "kube-api-access-vwmgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.112754 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35d359f6-a748-4388-92f2-497f21cca720" (UID: "35d359f6-a748-4388-92f2-497f21cca720"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.120873 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "929f6f10-eafc-40d0-9517-4cdd93f448ba" (UID: "929f6f10-eafc-40d0-9517-4cdd93f448ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.200773 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwmgc\" (UniqueName: \"kubernetes.io/projected/929f6f10-eafc-40d0-9517-4cdd93f448ba-kube-api-access-vwmgc\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.200841 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.200852 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/929f6f10-eafc-40d0-9517-4cdd93f448ba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.200860 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35d359f6-a748-4388-92f2-497f21cca720-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.251822 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7zj69"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.721312 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9mx4" 
event={"ID":"929f6f10-eafc-40d0-9517-4cdd93f448ba","Type":"ContainerDied","Data":"1a04bc11456fed84f67b8c5c586d2b61b20a88f2d42d4c937ce3e3d08453f910"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.721743 4822 scope.go:117] "RemoveContainer" containerID="b98de16f7b117fbb72dbe83ca81b7eeb1c86d79691728e95a1038790e9db21e7" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.721560 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9mx4" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.724490 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" event={"ID":"a176335f-a8bb-476a-bc6d-540be238a200","Type":"ContainerDied","Data":"df7a00a5a1bff4a0a6be0b4aceab3bccce85610d3555ea187ce4945e5e86c3fc"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.724636 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cvtdr" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.728678 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcm6q" event={"ID":"35d359f6-a748-4388-92f2-497f21cca720","Type":"ContainerDied","Data":"c2053e34a710aa03c5bb3b721dc399faf4057ee0be715821b82b8ba4f0632500"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.728790 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jcm6q" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.733785 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdk5h" event={"ID":"14d101d5-fd79-404a-9c5c-157e42608ae5","Type":"ContainerDied","Data":"0e1a96eb4e07d2fa9039f0b0154e66feac7b0724e5c9dfe6d15e36f67243ec41"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.733950 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdk5h" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.735637 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69" event={"ID":"8a112c7a-6133-483e-b34c-f12bfcd7a4ac","Type":"ContainerStarted","Data":"0ea413d003fe60d83fd3d5fd15361e5ee2fdbb41e4d55e514f1301b639a9e64a"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.735673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69" event={"ID":"8a112c7a-6133-483e-b34c-f12bfcd7a4ac","Type":"ContainerStarted","Data":"537566bcae2786d5300b0748f34e075eb10192a43e6384410734109b1504a9ae"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.737450 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.741970 4822 scope.go:117] "RemoveContainer" containerID="becece30a8d5758c96991bf8469d9db283b6795184df0df4248fdb7a0d8033b7" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.742876 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.744090 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6ndhv" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.744065 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6ndhv" event={"ID":"ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa","Type":"ContainerDied","Data":"510e5bf3723c48756ddd8f775ca28addb869aa8bd96bded354d68ffc41e32572"} Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.757608 4822 scope.go:117] "RemoveContainer" containerID="6b01067c9e4cc89b921f20f4c2950d062b6e93c886cc55bb0f526df4c4b6b613" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.776710 4822 scope.go:117] "RemoveContainer" containerID="f05936b518bd89d54b5b85413c66e23cffeac395745311b2c6bde21d5ee7a831" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.782289 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7zj69" podStartSLOduration=1.782251047 podStartE2EDuration="1.782251047s" podCreationTimestamp="2025-10-10 06:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:28:34.769996693 +0000 UTC m=+261.865154889" watchObservedRunningTime="2025-10-10 06:28:34.782251047 +0000 UTC m=+261.877409243" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.792203 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.794447 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cvtdr"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.809733 4822 scope.go:117] "RemoveContainer" containerID="33a507a0e13ce77fe120279090db238ed8801316cd65b4a34718b11ad7ae6986" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.811830 4822 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-jcm6q"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.816283 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jcm6q"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.825110 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.827774 4822 scope.go:117] "RemoveContainer" containerID="8dc97415c655b1481305faf39d5fac27194d36ef7dc2f2e18282c2a5a42c5614" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.830090 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9mx4"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.835207 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdk5h"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.842150 4822 scope.go:117] "RemoveContainer" containerID="fd887a5df83904b93e58886b65109b85f8175c37973a492f1deef0ac727bdb30" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.844488 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fdk5h"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.858220 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.862016 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6ndhv"] Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.866712 4822 scope.go:117] "RemoveContainer" containerID="76a1e5e483a6ea1d2237086cd980fcd80162acd3258b6a82fbbe3bbebe5621d9" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.887658 4822 scope.go:117] "RemoveContainer" containerID="1c64c0d1996880b66145e5297dca4266df4ec140ce59e6511d3699cd447a37d6" Oct 10 
06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.900374 4822 scope.go:117] "RemoveContainer" containerID="825ec843bd69f3b6e04dd0e35407995c786ba6dd53a3fe81642dfc322247e64d" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.913157 4822 scope.go:117] "RemoveContainer" containerID="900eaec8b37e97d4ffb775bbe75695003980af3972fc02cf8e705b5f79acfe47" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.925681 4822 scope.go:117] "RemoveContainer" containerID="2b147d90ee0eb587f684606327f53e2344806f07030864e5b21923066653e9e1" Oct 10 06:28:34 crc kubenswrapper[4822]: I1010 06:28:34.942617 4822 scope.go:117] "RemoveContainer" containerID="b5d6f8bd5fffbd7bdc00a7e93712079358b07196854b9ddd965966b3e916964a" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.661232 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" path="/var/lib/kubelet/pods/14d101d5-fd79-404a-9c5c-157e42608ae5/volumes" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.662408 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d359f6-a748-4388-92f2-497f21cca720" path="/var/lib/kubelet/pods/35d359f6-a748-4388-92f2-497f21cca720/volumes" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.663153 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" path="/var/lib/kubelet/pods/929f6f10-eafc-40d0-9517-4cdd93f448ba/volumes" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.664472 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a176335f-a8bb-476a-bc6d-540be238a200" path="/var/lib/kubelet/pods/a176335f-a8bb-476a-bc6d-540be238a200/volumes" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.665252 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" path="/var/lib/kubelet/pods/ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa/volumes" Oct 10 06:28:35 crc kubenswrapper[4822]: 
I1010 06:28:35.681770 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6xmrd"] Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682041 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682057 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682076 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682084 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682097 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682104 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682116 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682124 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682133 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682141 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682150 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682158 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682172 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682180 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682189 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682197 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682210 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682217 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682225 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682233 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682242 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682249 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682261 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682269 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="extract-utilities" Oct 10 06:28:35 crc kubenswrapper[4822]: E1010 06:28:35.682281 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682288 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="extract-content" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682388 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d359f6-a748-4388-92f2-497f21cca720" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682403 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="929f6f10-eafc-40d0-9517-4cdd93f448ba" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682411 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d101d5-fd79-404a-9c5c-157e42608ae5" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682425 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc5cd2c-5b8b-462f-bff5-3b8dfff8a5fa" containerName="registry-server" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.682434 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a176335f-a8bb-476a-bc6d-540be238a200" containerName="marketplace-operator" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.683520 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.689236 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.689521 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xmrd"] Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.817372 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdsm\" (UniqueName: \"kubernetes.io/projected/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-kube-api-access-fpdsm\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.817439 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-utilities\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.817511 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-catalog-content\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.882566 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qnhn"] Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.883444 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.888296 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.894785 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qnhn"] Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.918288 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-catalog-content\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.918427 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdsm\" (UniqueName: \"kubernetes.io/projected/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-kube-api-access-fpdsm\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.918463 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-utilities\") pod 
\"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.918727 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-catalog-content\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.918830 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-utilities\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:35 crc kubenswrapper[4822]: I1010 06:28:35.937789 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdsm\" (UniqueName: \"kubernetes.io/projected/4e6c6fef-78fc-44a1-8838-562a7eb63f8c-kube-api-access-fpdsm\") pod \"redhat-marketplace-6xmrd\" (UID: \"4e6c6fef-78fc-44a1-8838-562a7eb63f8c\") " pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.003729 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.019579 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4qz\" (UniqueName: \"kubernetes.io/projected/6b8cf7ab-4f72-4127-9e04-ef062701505a-kube-api-access-qz4qz\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.019629 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-catalog-content\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.019677 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-utilities\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.121035 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4qz\" (UniqueName: \"kubernetes.io/projected/6b8cf7ab-4f72-4127-9e04-ef062701505a-kube-api-access-qz4qz\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.122008 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-catalog-content\") pod 
\"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.122065 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-utilities\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.122510 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-catalog-content\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.122524 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8cf7ab-4f72-4127-9e04-ef062701505a-utilities\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.147545 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4qz\" (UniqueName: \"kubernetes.io/projected/6b8cf7ab-4f72-4127-9e04-ef062701505a-kube-api-access-qz4qz\") pod \"redhat-operators-6qnhn\" (UID: \"6b8cf7ab-4f72-4127-9e04-ef062701505a\") " pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.201614 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.384917 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qnhn"] Oct 10 06:28:36 crc kubenswrapper[4822]: W1010 06:28:36.395467 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8cf7ab_4f72_4127_9e04_ef062701505a.slice/crio-f17e846b62cc5b1c0337a97eeb45db1dd6514ea65683e8a2a2f79e3085d8678d WatchSource:0}: Error finding container f17e846b62cc5b1c0337a97eeb45db1dd6514ea65683e8a2a2f79e3085d8678d: Status 404 returned error can't find the container with id f17e846b62cc5b1c0337a97eeb45db1dd6514ea65683e8a2a2f79e3085d8678d Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.407917 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xmrd"] Oct 10 06:28:36 crc kubenswrapper[4822]: W1010 06:28:36.413278 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6c6fef_78fc_44a1_8838_562a7eb63f8c.slice/crio-382bfb0c5c835ac9412551c1a2520289e37e1a01b2e4236304d8887364161478 WatchSource:0}: Error finding container 382bfb0c5c835ac9412551c1a2520289e37e1a01b2e4236304d8887364161478: Status 404 returned error can't find the container with id 382bfb0c5c835ac9412551c1a2520289e37e1a01b2e4236304d8887364161478 Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.765527 4822 generic.go:334] "Generic (PLEG): container finished" podID="6b8cf7ab-4f72-4127-9e04-ef062701505a" containerID="1921c4fa6edc8b8823db6fa0d8ef2e9f333226717f7c9e54c11a296b9126a825" exitCode=0 Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.765593 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qnhn" 
event={"ID":"6b8cf7ab-4f72-4127-9e04-ef062701505a","Type":"ContainerDied","Data":"1921c4fa6edc8b8823db6fa0d8ef2e9f333226717f7c9e54c11a296b9126a825"} Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.765621 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qnhn" event={"ID":"6b8cf7ab-4f72-4127-9e04-ef062701505a","Type":"ContainerStarted","Data":"f17e846b62cc5b1c0337a97eeb45db1dd6514ea65683e8a2a2f79e3085d8678d"} Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.769357 4822 generic.go:334] "Generic (PLEG): container finished" podID="4e6c6fef-78fc-44a1-8838-562a7eb63f8c" containerID="ee25e7a721206783231b4e13df87e0cdcbd1b5c24b7781b9a23f7a0a70025361" exitCode=0 Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.769417 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xmrd" event={"ID":"4e6c6fef-78fc-44a1-8838-562a7eb63f8c","Type":"ContainerDied","Data":"ee25e7a721206783231b4e13df87e0cdcbd1b5c24b7781b9a23f7a0a70025361"} Oct 10 06:28:36 crc kubenswrapper[4822]: I1010 06:28:36.769446 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xmrd" event={"ID":"4e6c6fef-78fc-44a1-8838-562a7eb63f8c","Type":"ContainerStarted","Data":"382bfb0c5c835ac9412551c1a2520289e37e1a01b2e4236304d8887364161478"} Oct 10 06:28:37 crc kubenswrapper[4822]: I1010 06:28:37.775403 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qnhn" event={"ID":"6b8cf7ab-4f72-4127-9e04-ef062701505a","Type":"ContainerStarted","Data":"86edc102c4140bfeb8f9a13e08b2d114f4a7afe82214f35261b215487cc9e022"} Oct 10 06:28:37 crc kubenswrapper[4822]: I1010 06:28:37.777659 4822 generic.go:334] "Generic (PLEG): container finished" podID="4e6c6fef-78fc-44a1-8838-562a7eb63f8c" containerID="c7371e3bcc477b75a36526f85f9e8f06f6d0a53e4562d84c2f2b9098be8c5f14" exitCode=0 Oct 10 06:28:37 crc kubenswrapper[4822]: I1010 
06:28:37.777700 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xmrd" event={"ID":"4e6c6fef-78fc-44a1-8838-562a7eb63f8c","Type":"ContainerDied","Data":"c7371e3bcc477b75a36526f85f9e8f06f6d0a53e4562d84c2f2b9098be8c5f14"} Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.078148 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fmd4b"] Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.081558 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.088039 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmd4b"] Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.088505 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.247003 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-catalog-content\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.247078 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68mg\" (UniqueName: \"kubernetes.io/projected/4dbd1ffe-6c24-4eae-861f-de345f3f855f-kube-api-access-r68mg\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.247101 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-utilities\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.280319 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.284409 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.287354 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.291582 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.349056 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68mg\" (UniqueName: \"kubernetes.io/projected/4dbd1ffe-6c24-4eae-861f-de345f3f855f-kube-api-access-r68mg\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.349119 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-utilities\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.349222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-catalog-content\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.350306 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-utilities\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.350406 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbd1ffe-6c24-4eae-861f-de345f3f855f-catalog-content\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.373543 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68mg\" (UniqueName: \"kubernetes.io/projected/4dbd1ffe-6c24-4eae-861f-de345f3f855f-kube-api-access-r68mg\") pod \"certified-operators-fmd4b\" (UID: \"4dbd1ffe-6c24-4eae-861f-de345f3f855f\") " pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.450619 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.450976 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnj4p\" (UniqueName: \"kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.451070 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.451110 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.552669 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.553124 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content\") pod 
\"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.553190 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnj4p\" (UniqueName: \"kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.553624 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.553637 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.577702 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnj4p\" (UniqueName: \"kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p\") pod \"community-operators-724mg\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.625131 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.656478 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmd4b"] Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.789066 4822 generic.go:334] "Generic (PLEG): container finished" podID="6b8cf7ab-4f72-4127-9e04-ef062701505a" containerID="86edc102c4140bfeb8f9a13e08b2d114f4a7afe82214f35261b215487cc9e022" exitCode=0 Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.789131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qnhn" event={"ID":"6b8cf7ab-4f72-4127-9e04-ef062701505a","Type":"ContainerDied","Data":"86edc102c4140bfeb8f9a13e08b2d114f4a7afe82214f35261b215487cc9e022"} Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.795673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xmrd" event={"ID":"4e6c6fef-78fc-44a1-8838-562a7eb63f8c","Type":"ContainerStarted","Data":"3047902adef3241171e24fcadf64b902436ba70a2194c2671f2145bfd4d83c6d"} Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.799137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmd4b" event={"ID":"4dbd1ffe-6c24-4eae-861f-de345f3f855f","Type":"ContainerStarted","Data":"32ed8414cae4f9939cc8bb246e51412d840f556eaecaec63f4f69d0947475da2"} Oct 10 06:28:38 crc kubenswrapper[4822]: I1010 06:28:38.825249 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6xmrd" podStartSLOduration=2.392496376 podStartE2EDuration="3.825227994s" podCreationTimestamp="2025-10-10 06:28:35 +0000 UTC" firstStartedPulling="2025-10-10 06:28:36.770588436 +0000 UTC m=+263.865746632" lastFinishedPulling="2025-10-10 06:28:38.203320054 +0000 UTC m=+265.298478250" observedRunningTime="2025-10-10 06:28:38.822378409 +0000 
UTC m=+265.917536595" watchObservedRunningTime="2025-10-10 06:28:38.825227994 +0000 UTC m=+265.920386190" Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.017460 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 06:28:39 crc kubenswrapper[4822]: W1010 06:28:39.023740 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc48094cd_a9ab_4c00_9e04_cc5bcaa99716.slice/crio-fbd536e1b072ebb7ccb1029f09f2662ed42b0cb1c284ed6a4244ae3970db0ce5 WatchSource:0}: Error finding container fbd536e1b072ebb7ccb1029f09f2662ed42b0cb1c284ed6a4244ae3970db0ce5: Status 404 returned error can't find the container with id fbd536e1b072ebb7ccb1029f09f2662ed42b0cb1c284ed6a4244ae3970db0ce5 Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.807407 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qnhn" event={"ID":"6b8cf7ab-4f72-4127-9e04-ef062701505a","Type":"ContainerStarted","Data":"810b26b7f2abb2847678dbf18c21b2d90231ca619f490b507208ef7257ed3a5c"} Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.809104 4822 generic.go:334] "Generic (PLEG): container finished" podID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerID="0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304" exitCode=0 Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.809349 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerDied","Data":"0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304"} Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.809407 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" 
event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerStarted","Data":"fbd536e1b072ebb7ccb1029f09f2662ed42b0cb1c284ed6a4244ae3970db0ce5"} Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.810876 4822 generic.go:334] "Generic (PLEG): container finished" podID="4dbd1ffe-6c24-4eae-861f-de345f3f855f" containerID="04c452cb1e073ffd36eef7bdca6831a9c959bf6fdaef8b7cfe08b48dcc18abcf" exitCode=0 Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.811095 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmd4b" event={"ID":"4dbd1ffe-6c24-4eae-861f-de345f3f855f","Type":"ContainerDied","Data":"04c452cb1e073ffd36eef7bdca6831a9c959bf6fdaef8b7cfe08b48dcc18abcf"} Oct 10 06:28:39 crc kubenswrapper[4822]: I1010 06:28:39.829787 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6qnhn" podStartSLOduration=2.364463348 podStartE2EDuration="4.829754601s" podCreationTimestamp="2025-10-10 06:28:35 +0000 UTC" firstStartedPulling="2025-10-10 06:28:36.766959668 +0000 UTC m=+263.862117864" lastFinishedPulling="2025-10-10 06:28:39.232250921 +0000 UTC m=+266.327409117" observedRunningTime="2025-10-10 06:28:39.826654419 +0000 UTC m=+266.921812625" watchObservedRunningTime="2025-10-10 06:28:39.829754601 +0000 UTC m=+266.924912797" Oct 10 06:28:40 crc kubenswrapper[4822]: I1010 06:28:40.819485 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerStarted","Data":"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33"} Oct 10 06:28:40 crc kubenswrapper[4822]: I1010 06:28:40.821527 4822 generic.go:334] "Generic (PLEG): container finished" podID="4dbd1ffe-6c24-4eae-861f-de345f3f855f" containerID="530277f67780671b00692a18bc837ca7623aecf52d8f6a368597587cdec8b183" exitCode=0 Oct 10 06:28:40 crc kubenswrapper[4822]: I1010 06:28:40.821622 
4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmd4b" event={"ID":"4dbd1ffe-6c24-4eae-861f-de345f3f855f","Type":"ContainerDied","Data":"530277f67780671b00692a18bc837ca7623aecf52d8f6a368597587cdec8b183"} Oct 10 06:28:41 crc kubenswrapper[4822]: I1010 06:28:41.828420 4822 generic.go:334] "Generic (PLEG): container finished" podID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerID="71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33" exitCode=0 Oct 10 06:28:41 crc kubenswrapper[4822]: I1010 06:28:41.828542 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerDied","Data":"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33"} Oct 10 06:28:42 crc kubenswrapper[4822]: I1010 06:28:42.836602 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerStarted","Data":"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599"} Oct 10 06:28:42 crc kubenswrapper[4822]: I1010 06:28:42.839025 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmd4b" event={"ID":"4dbd1ffe-6c24-4eae-861f-de345f3f855f","Type":"ContainerStarted","Data":"d10b7b842191aa9eb4ca5557cdefcd5b80696f238dd165c68544e41b6ab75bdf"} Oct 10 06:28:42 crc kubenswrapper[4822]: I1010 06:28:42.857157 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-724mg" podStartSLOduration=2.401430741 podStartE2EDuration="4.857141278s" podCreationTimestamp="2025-10-10 06:28:38 +0000 UTC" firstStartedPulling="2025-10-10 06:28:39.810455485 +0000 UTC m=+266.905613681" lastFinishedPulling="2025-10-10 06:28:42.266166012 +0000 UTC m=+269.361324218" observedRunningTime="2025-10-10 
06:28:42.856885431 +0000 UTC m=+269.952043637" watchObservedRunningTime="2025-10-10 06:28:42.857141278 +0000 UTC m=+269.952299474" Oct 10 06:28:42 crc kubenswrapper[4822]: I1010 06:28:42.875842 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fmd4b" podStartSLOduration=3.3833852269999998 podStartE2EDuration="4.875824836s" podCreationTimestamp="2025-10-10 06:28:38 +0000 UTC" firstStartedPulling="2025-10-10 06:28:39.813975341 +0000 UTC m=+266.909133537" lastFinishedPulling="2025-10-10 06:28:41.30641495 +0000 UTC m=+268.401573146" observedRunningTime="2025-10-10 06:28:42.873489206 +0000 UTC m=+269.968647412" watchObservedRunningTime="2025-10-10 06:28:42.875824836 +0000 UTC m=+269.970983042" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.004259 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.004690 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.048522 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.201972 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.202040 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.246590 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.896891 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6xmrd" Oct 10 06:28:46 crc kubenswrapper[4822]: I1010 06:28:46.897283 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qnhn" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.451725 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.452111 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.494116 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.625560 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.625647 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.665434 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.905108 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fmd4b" Oct 10 06:28:48 crc kubenswrapper[4822]: I1010 06:28:48.907969 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-724mg" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.136395 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk"] Oct 10 
06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.137732 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.139768 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.143020 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.149305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk"] Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.313656 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.313913 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.314022 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgfr7\" (UniqueName: \"kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7\") pod 
\"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.415253 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.415998 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgfr7\" (UniqueName: \"kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.416131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.417083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.421577 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.431526 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgfr7\" (UniqueName: \"kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7\") pod \"collect-profiles-29334630-zclqk\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.457683 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:00 crc kubenswrapper[4822]: I1010 06:30:00.649707 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk"] Oct 10 06:30:01 crc kubenswrapper[4822]: I1010 06:30:01.279036 4822 generic.go:334] "Generic (PLEG): container finished" podID="70328244-19ca-4109-91cf-092435c7485c" containerID="a229f0517e627499c510ad3b7f4c030846600b19d5301ea3c6a7e4ea5ca71156" exitCode=0 Oct 10 06:30:01 crc kubenswrapper[4822]: I1010 06:30:01.279081 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" event={"ID":"70328244-19ca-4109-91cf-092435c7485c","Type":"ContainerDied","Data":"a229f0517e627499c510ad3b7f4c030846600b19d5301ea3c6a7e4ea5ca71156"} Oct 10 06:30:01 crc kubenswrapper[4822]: I1010 06:30:01.279119 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" 
event={"ID":"70328244-19ca-4109-91cf-092435c7485c","Type":"ContainerStarted","Data":"0f8d867fa91a6b3cfa42fa3381b0dd52eaf2a5770e52ef90890b0a1cccb48285"} Oct 10 06:30:01 crc kubenswrapper[4822]: I1010 06:30:01.336455 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:30:01 crc kubenswrapper[4822]: I1010 06:30:01.336525 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.504759 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.644574 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgfr7\" (UniqueName: \"kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7\") pod \"70328244-19ca-4109-91cf-092435c7485c\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.644690 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume\") pod \"70328244-19ca-4109-91cf-092435c7485c\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.644729 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume\") pod \"70328244-19ca-4109-91cf-092435c7485c\" (UID: \"70328244-19ca-4109-91cf-092435c7485c\") " Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.645367 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume" (OuterVolumeSpecName: "config-volume") pod "70328244-19ca-4109-91cf-092435c7485c" (UID: "70328244-19ca-4109-91cf-092435c7485c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.650368 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70328244-19ca-4109-91cf-092435c7485c" (UID: "70328244-19ca-4109-91cf-092435c7485c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.650554 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7" (OuterVolumeSpecName: "kube-api-access-vgfr7") pod "70328244-19ca-4109-91cf-092435c7485c" (UID: "70328244-19ca-4109-91cf-092435c7485c"). InnerVolumeSpecName "kube-api-access-vgfr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.752094 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgfr7\" (UniqueName: \"kubernetes.io/projected/70328244-19ca-4109-91cf-092435c7485c-kube-api-access-vgfr7\") on node \"crc\" DevicePath \"\"" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.752137 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70328244-19ca-4109-91cf-092435c7485c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:30:02 crc kubenswrapper[4822]: I1010 06:30:02.752190 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70328244-19ca-4109-91cf-092435c7485c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:30:03 crc kubenswrapper[4822]: I1010 06:30:03.292610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" event={"ID":"70328244-19ca-4109-91cf-092435c7485c","Type":"ContainerDied","Data":"0f8d867fa91a6b3cfa42fa3381b0dd52eaf2a5770e52ef90890b0a1cccb48285"} Oct 10 06:30:03 crc kubenswrapper[4822]: I1010 06:30:03.292687 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk" Oct 10 06:30:03 crc kubenswrapper[4822]: I1010 06:30:03.292695 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8d867fa91a6b3cfa42fa3381b0dd52eaf2a5770e52ef90890b0a1cccb48285" Oct 10 06:30:31 crc kubenswrapper[4822]: I1010 06:30:31.336612 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:30:31 crc kubenswrapper[4822]: I1010 06:30:31.337299 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.337344 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.337918 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.337972 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.338635 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.338684 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14" gracePeriod=600 Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.657647 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14" exitCode=0 Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.657747 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14"} Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.658026 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1"} Oct 10 06:31:01 crc kubenswrapper[4822]: I1010 06:31:01.658082 4822 scope.go:117] "RemoveContainer" 
containerID="d33d122338c60c6ebb67a067ca52e553ae286a2d102ebf7ad2cf8b8a6427b6bf" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.568607 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2k8n"] Oct 10 06:31:44 crc kubenswrapper[4822]: E1010 06:31:44.569359 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70328244-19ca-4109-91cf-092435c7485c" containerName="collect-profiles" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.569372 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="70328244-19ca-4109-91cf-092435c7485c" containerName="collect-profiles" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.569468 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="70328244-19ca-4109-91cf-092435c7485c" containerName="collect-profiles" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.569839 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.593044 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2k8n"] Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769078 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-bound-sa-token\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769332 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-trusted-ca\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: 
\"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769462 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzggj\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-kube-api-access-tzggj\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769561 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-tls\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769643 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-certificates\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769708 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ecf9801-c498-4a79-9ac9-1984dafd705f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769830 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.769906 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ecf9801-c498-4a79-9ac9-1984dafd705f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.806771 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.871083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ecf9801-c498-4a79-9ac9-1984dafd705f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.871407 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-bound-sa-token\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.871614 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-trusted-ca\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.871841 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzggj\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-kube-api-access-tzggj\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.872046 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-tls\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.872588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-certificates\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.871838 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ecf9801-c498-4a79-9ac9-1984dafd705f-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.872783 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ecf9801-c498-4a79-9ac9-1984dafd705f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.873324 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-trusted-ca\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.874671 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-certificates\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.878323 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-registry-tls\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.878341 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2ecf9801-c498-4a79-9ac9-1984dafd705f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.888798 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-bound-sa-token\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:44 crc kubenswrapper[4822]: I1010 06:31:44.900112 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzggj\" (UniqueName: \"kubernetes.io/projected/2ecf9801-c498-4a79-9ac9-1984dafd705f-kube-api-access-tzggj\") pod \"image-registry-66df7c8f76-p2k8n\" (UID: \"2ecf9801-c498-4a79-9ac9-1984dafd705f\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:45 crc kubenswrapper[4822]: I1010 06:31:45.187220 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:45 crc kubenswrapper[4822]: I1010 06:31:45.402500 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2k8n"] Oct 10 06:31:45 crc kubenswrapper[4822]: I1010 06:31:45.920861 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" event={"ID":"2ecf9801-c498-4a79-9ac9-1984dafd705f","Type":"ContainerStarted","Data":"bb9b268590f07b301f18c4e81b82553d83fadbb174dadb8bd692116c67103335"} Oct 10 06:31:45 crc kubenswrapper[4822]: I1010 06:31:45.921128 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:31:45 crc kubenswrapper[4822]: I1010 06:31:45.921139 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" event={"ID":"2ecf9801-c498-4a79-9ac9-1984dafd705f","Type":"ContainerStarted","Data":"09396e92fc01c58e8e7c09bf917a2673f222eaf90ead7aaf3e81bcdf30eccde9"} Oct 10 06:32:05 crc kubenswrapper[4822]: I1010 06:32:05.195266 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" Oct 10 06:32:05 crc kubenswrapper[4822]: I1010 06:32:05.233124 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p2k8n" podStartSLOduration=21.233080544 podStartE2EDuration="21.233080544s" podCreationTimestamp="2025-10-10 06:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:31:45.958183526 +0000 UTC m=+453.053341732" watchObservedRunningTime="2025-10-10 06:32:05.233080544 +0000 UTC m=+472.328238760" Oct 10 06:32:05 crc kubenswrapper[4822]: I1010 06:32:05.254947 4822 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.300072 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" podUID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" containerName="registry" containerID="cri-o://857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef" gracePeriod=30 Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.693942 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.830893 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.830985 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831062 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831107 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831167 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831243 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbrj\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831283 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.831454 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\" (UID: \"c27ef059-d8bc-44a1-8940-bcb6031a72b1\") " Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.832337 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.832425 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.839877 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.847056 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj" (OuterVolumeSpecName: "kube-api-access-7pbrj") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "kube-api-access-7pbrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.847248 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.847325 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.847704 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.851687 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c27ef059-d8bc-44a1-8940-bcb6031a72b1" (UID: "c27ef059-d8bc-44a1-8940-bcb6031a72b1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933494 4822 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27ef059-d8bc-44a1-8940-bcb6031a72b1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933548 4822 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933566 4822 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933583 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933599 4822 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27ef059-d8bc-44a1-8940-bcb6031a72b1-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933616 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbrj\" (UniqueName: \"kubernetes.io/projected/c27ef059-d8bc-44a1-8940-bcb6031a72b1-kube-api-access-7pbrj\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:30 crc kubenswrapper[4822]: I1010 06:32:30.933633 4822 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27ef059-d8bc-44a1-8940-bcb6031a72b1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 10 06:32:31 crc 
kubenswrapper[4822]: I1010 06:32:31.204302 4822 generic.go:334] "Generic (PLEG): container finished" podID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" containerID="857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef" exitCode=0 Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.204384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" event={"ID":"c27ef059-d8bc-44a1-8940-bcb6031a72b1","Type":"ContainerDied","Data":"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef"} Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.204419 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.204921 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tvjnx" event={"ID":"c27ef059-d8bc-44a1-8940-bcb6031a72b1","Type":"ContainerDied","Data":"8d223c4b68eaabf9a38886a2037280780e75d0b60999738079bad38058f72b5e"} Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.204965 4822 scope.go:117] "RemoveContainer" containerID="857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef" Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.227146 4822 scope.go:117] "RemoveContainer" containerID="857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef" Oct 10 06:32:31 crc kubenswrapper[4822]: E1010 06:32:31.227695 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef\": container with ID starting with 857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef not found: ID does not exist" containerID="857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef" Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.227730 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef"} err="failed to get container status \"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef\": rpc error: code = NotFound desc = could not find container \"857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef\": container with ID starting with 857b39a90d9886a47fe9ba41cbdbee34516faace149de92e7167e616b97336ef not found: ID does not exist" Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.255276 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.265476 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tvjnx"] Oct 10 06:32:31 crc kubenswrapper[4822]: I1010 06:32:31.663403 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" path="/var/lib/kubelet/pods/c27ef059-d8bc-44a1-8940-bcb6031a72b1/volumes" Oct 10 06:33:01 crc kubenswrapper[4822]: I1010 06:33:01.337684 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:33:01 crc kubenswrapper[4822]: I1010 06:33:01.338495 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:33:31 crc kubenswrapper[4822]: I1010 06:33:31.337741 4822 patch_prober.go:28] interesting 
pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:33:31 crc kubenswrapper[4822]: I1010 06:33:31.338350 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.336389 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.337352 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.337420 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.338334 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.338441 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1" gracePeriod=600 Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.772784 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1" exitCode=0 Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.773007 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1"} Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.773281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60"} Oct 10 06:34:01 crc kubenswrapper[4822]: I1010 06:34:01.773337 4822 scope.go:117] "RemoveContainer" containerID="655b42e8ceedf984b323c00ca6aee28025c1d29693f1d01235c31b26947c6c14" Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.845655 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bzbn"] Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.847226 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-controller" 
containerID="cri-o://3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.847879 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="sbdb" containerID="cri-o://331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.848067 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="nbdb" containerID="cri-o://921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.848166 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="northd" containerID="cri-o://5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.848254 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.848344 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-node" containerID="cri-o://f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.848423 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-acl-logging" containerID="cri-o://8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d" gracePeriod=30 Oct 10 06:35:48 crc kubenswrapper[4822]: I1010 06:35:48.896676 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" containerID="cri-o://592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9" gracePeriod=30 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.182137 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/3.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.184317 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovn-acl-logging/0.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.184910 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovn-controller/0.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.185289 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208308 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208356 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208389 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208412 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208432 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208455 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208473 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208497 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208514 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208553 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngm6\" (UniqueName: \"kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208576 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: 
\"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208602 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208623 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208648 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208669 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208689 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208710 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208735 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208755 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208775 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch\") pod \"2bd611ad-9a8c-489f-903b-d75912bb1fef\" (UID: \"2bd611ad-9a8c-489f-903b-d75912bb1fef\") " Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.208961 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209418 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209452 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209475 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209500 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209523 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209546 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209570 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209591 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash" (OuterVolumeSpecName: "host-slash") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209611 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log" (OuterVolumeSpecName: "node-log") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209633 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.209652 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.211438 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.211881 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.212508 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.212537 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.214205 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket" (OuterVolumeSpecName: "log-socket") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.215318 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.222325 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6" (OuterVolumeSpecName: "kube-api-access-cngm6") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "kube-api-access-cngm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.227576 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2bd611ad-9a8c-489f-903b-d75912bb1fef" (UID: "2bd611ad-9a8c-489f-903b-d75912bb1fef"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243644 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vtt5z"] Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243878 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243891 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243900 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243906 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243914 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kubecfg-setup" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243920 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kubecfg-setup" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243928 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="sbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243934 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="sbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243943 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" 
containerName="kube-rbac-proxy-node" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243949 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-node" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243957 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243963 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243971 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243976 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.243986 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" containerName="registry" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.243992 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" containerName="registry" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244000 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-acl-logging" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244006 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-acl-logging" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244015 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244021 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-ovn-metrics" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244032 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="northd" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244038 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="northd" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244045 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="nbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244051 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="nbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244156 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244164 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244170 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-ovn-metrics" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244180 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="sbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244188 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27ef059-d8bc-44a1-8940-bcb6031a72b1" 
containerName="registry" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244196 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovn-acl-logging" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244204 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244212 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244219 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="northd" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244227 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="kube-rbac-proxy-node" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244235 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="nbdb" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244329 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244336 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.244349 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244355 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244443 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.244456 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerName="ovnkube-controller" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.246191 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309614 4822 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309662 4822 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309684 4822 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309704 4822 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309724 4822 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-var-lib-openvswitch\") on node 
\"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309742 4822 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309762 4822 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309780 4822 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-slash\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309816 4822 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-node-log\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309835 4822 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309852 4822 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309870 4822 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309888 4822 reconciler_common.go:293] "Volume detached 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309908 4822 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309929 4822 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309952 4822 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309971 4822 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.309992 4822 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bd611ad-9a8c-489f-903b-d75912bb1fef-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.310016 4822 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bd611ad-9a8c-489f-903b-d75912bb1fef-log-socket\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.310056 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cngm6\" (UniqueName: 
\"kubernetes.io/projected/2bd611ad-9a8c-489f-903b-d75912bb1fef-kube-api-access-cngm6\") on node \"crc\" DevicePath \"\"" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411035 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411277 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-etc-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411356 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-systemd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411437 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411512 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-var-lib-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411590 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-log-socket\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411650 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-netd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411711 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-script-lib\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.411816 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdg4s\" (UniqueName: \"kubernetes.io/projected/241c97cd-d574-4c5f-96e1-59bb42981db2-kube-api-access-bdg4s\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412064 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-ovn\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412134 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-slash\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412173 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-netns\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412204 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-config\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412236 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/241c97cd-d574-4c5f-96e1-59bb42981db2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412268 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-kubelet\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412296 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-node-log\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412340 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-systemd-units\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412383 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-env-overrides\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412422 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.412468 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-bin\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.461498 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/2.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.462477 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/1.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.462552 4822 generic.go:334] "Generic (PLEG): container finished" podID="ec9c77cf-dd02-4e39-b204-9f6540406973" containerID="e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad" exitCode=2 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.462737 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerDied","Data":"e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.463015 4822 scope.go:117] "RemoveContainer" containerID="29a81e9c1a9ca7ba2d9013809ac682a5af8c697d3661b40676c9582991629dc5" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.463616 4822 scope.go:117] "RemoveContainer" containerID="e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad" Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.464228 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5x2kt_openshift-multus(ec9c77cf-dd02-4e39-b204-9f6540406973)\"" pod="openshift-multus/multus-5x2kt" podUID="ec9c77cf-dd02-4e39-b204-9f6540406973" Oct 
10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.466883 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovnkube-controller/3.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.470868 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovn-acl-logging/0.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471303 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bzbn_2bd611ad-9a8c-489f-903b-d75912bb1fef/ovn-controller/0.log" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471739 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9" exitCode=0 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471761 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d" exitCode=0 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471770 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413" exitCode=0 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471777 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4" exitCode=0 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471783 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0" exitCode=0 Oct 10 06:35:49 crc 
kubenswrapper[4822]: I1010 06:35:49.471789 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582" exitCode=0 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471809 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d" exitCode=143 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471816 4822 generic.go:334] "Generic (PLEG): container finished" podID="2bd611ad-9a8c-489f-903b-d75912bb1fef" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da" exitCode=143 Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471857 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471868 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471878 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" 
event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471886 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471896 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471911 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471927 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471933 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471938 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471936 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.471943 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472040 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472046 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472051 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472057 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472061 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472070 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472079 4822 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472085 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472090 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472095 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472100 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472105 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472110 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472115 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472120 4822 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472124 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472138 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472144 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472149 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472154 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472159 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 
06:35:49.472164 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472169 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472173 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472178 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472183 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472189 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bzbn" event={"ID":"2bd611ad-9a8c-489f-903b-d75912bb1fef","Type":"ContainerDied","Data":"22e0e2b44f19b33be05e55d5b2b4b4d8ce58be4385f4852b106bbb4496f19628"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472196 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472202 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472207 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472212 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472217 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472223 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472227 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472234 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472239 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.472243 4822 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.512052 4822 scope.go:117] "RemoveContainer" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514149 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bzbn"] Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514487 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514519 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-etc-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514546 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-systemd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514579 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514635 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-var-lib-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514687 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-log-socket\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514642 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514710 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-netd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514731 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-script-lib\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 
06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514760 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdg4s\" (UniqueName: \"kubernetes.io/projected/241c97cd-d574-4c5f-96e1-59bb42981db2-kube-api-access-bdg4s\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514789 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-ovn\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514839 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-slash\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514862 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-netns\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514882 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-config\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514902 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/241c97cd-d574-4c5f-96e1-59bb42981db2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514922 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-kubelet\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514942 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-node-log\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514972 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-systemd-units\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.514996 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-env-overrides\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515016 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515039 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-bin\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515117 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-bin\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515157 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-var-lib-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515186 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-log-socket\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515214 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-cni-netd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515488 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-etc-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515589 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515743 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-systemd\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.515911 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-run-netns\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516073 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-script-lib\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516130 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-systemd-units\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516215 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-kubelet\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516274 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-node-log\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516321 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-openvswitch\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516380 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bzbn"]
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516395 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-run-ovn\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.516615 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-env-overrides\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.517145 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/241c97cd-d574-4c5f-96e1-59bb42981db2-ovnkube-config\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.517260 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/241c97cd-d574-4c5f-96e1-59bb42981db2-host-slash\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.520416 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/241c97cd-d574-4c5f-96e1-59bb42981db2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.532716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdg4s\" (UniqueName: \"kubernetes.io/projected/241c97cd-d574-4c5f-96e1-59bb42981db2-kube-api-access-bdg4s\") pod \"ovnkube-node-vtt5z\" (UID: \"241c97cd-d574-4c5f-96e1-59bb42981db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.538278 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.557206 4822 scope.go:117] "RemoveContainer" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.565950 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.571960 4822 scope.go:117] "RemoveContainer" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.595849 4822 scope.go:117] "RemoveContainer" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"
Oct 10 06:35:49 crc kubenswrapper[4822]: W1010 06:35:49.604375 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241c97cd_d574_4c5f_96e1_59bb42981db2.slice/crio-c6a3f5b0f93f4d0e5753874759fe5a42b36e60adc1aa398995ca3f1b639c304f WatchSource:0}: Error finding container c6a3f5b0f93f4d0e5753874759fe5a42b36e60adc1aa398995ca3f1b639c304f: Status 404 returned error can't find the container with id c6a3f5b0f93f4d0e5753874759fe5a42b36e60adc1aa398995ca3f1b639c304f
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.618452 4822 scope.go:117] "RemoveContainer" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.648050 4822 scope.go:117] "RemoveContainer" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.657752 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd611ad-9a8c-489f-903b-d75912bb1fef" path="/var/lib/kubelet/pods/2bd611ad-9a8c-489f-903b-d75912bb1fef/volumes"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.668460 4822 scope.go:117] "RemoveContainer" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.693316 4822 scope.go:117] "RemoveContainer" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.714047 4822 scope.go:117] "RemoveContainer" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.733327 4822 scope.go:117] "RemoveContainer" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.733683 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": container with ID starting with 592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9 not found: ID does not exist" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.733734 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} err="failed to get container status \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": rpc error: code = NotFound desc = could not find container \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": container with ID starting with 592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.733756 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.734095 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": container with ID starting with 605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0 not found: ID does not exist" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734116 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} err="failed to get container status \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": rpc error: code = NotFound desc = could not find container \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": container with ID starting with 605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734158 4822 scope.go:117] "RemoveContainer" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.734519 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": container with ID starting with 331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d not found: ID does not exist" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734537 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} err="failed to get container status \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": rpc error: code = NotFound desc = could not find container \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": container with ID starting with 331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734551 4822 scope.go:117] "RemoveContainer" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.734718 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": container with ID starting with 921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413 not found: ID does not exist" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734761 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} err="failed to get container status \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": rpc error: code = NotFound desc = could not find container \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": container with ID starting with 921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.734775 4822 scope.go:117] "RemoveContainer" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.735018 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": container with ID starting with 5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4 not found: ID does not exist" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735042 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} err="failed to get container status \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": rpc error: code = NotFound desc = could not find container \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": container with ID starting with 5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735076 4822 scope.go:117] "RemoveContainer" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.735416 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": container with ID starting with 24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0 not found: ID does not exist" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735438 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} err="failed to get container status \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": rpc error: code = NotFound desc = could not find container \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": container with ID starting with 24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735450 4822 scope.go:117] "RemoveContainer" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.735633 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": container with ID starting with f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582 not found: ID does not exist" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735653 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} err="failed to get container status \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": rpc error: code = NotFound desc = could not find container \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": container with ID starting with f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735666 4822 scope.go:117] "RemoveContainer" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.735881 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": container with ID starting with 8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d not found: ID does not exist" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735900 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} err="failed to get container status \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": rpc error: code = NotFound desc = could not find container \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": container with ID starting with 8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.735912 4822 scope.go:117] "RemoveContainer" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.736135 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": container with ID starting with 3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da not found: ID does not exist" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736153 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} err="failed to get container status \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": rpc error: code = NotFound desc = could not find container \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": container with ID starting with 3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736164 4822 scope.go:117] "RemoveContainer" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"
Oct 10 06:35:49 crc kubenswrapper[4822]: E1010 06:35:49.736523 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": container with ID starting with fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67 not found: ID does not exist" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736566 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} err="failed to get container status \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": rpc error: code = NotFound desc = could not find container \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": container with ID starting with fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736579 4822 scope.go:117] "RemoveContainer" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736771 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} err="failed to get container status \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": rpc error: code = NotFound desc = could not find container \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": container with ID starting with 592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.736790 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737050 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} err="failed to get container status \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": rpc error: code = NotFound desc = could not find container \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": container with ID starting with 605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737075 4822 scope.go:117] "RemoveContainer" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737268 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} err="failed to get container status \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": rpc error: code = NotFound desc = could not find container \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": container with ID starting with 331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737286 4822 scope.go:117] "RemoveContainer" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737477 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} err="failed to get container status \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": rpc error: code = NotFound desc = could not find container \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": container with ID starting with 921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737495 4822 scope.go:117] "RemoveContainer" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737663 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} err="failed to get container status \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": rpc error: code = NotFound desc = could not find container \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": container with ID starting with 5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737681 4822 scope.go:117] "RemoveContainer" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737857 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} err="failed to get container status \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": rpc error: code = NotFound desc = could not find container \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": container with ID starting with 24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.737872 4822 scope.go:117] "RemoveContainer" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738063 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} err="failed to get container status \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": rpc error: code = NotFound desc = could not find container \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": container with ID starting with f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738080 4822 scope.go:117] "RemoveContainer" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738267 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} err="failed to get container status \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": rpc error: code = NotFound desc = could not find container \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": container with ID starting with 8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738284 4822 scope.go:117] "RemoveContainer" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738496 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} err="failed to get container status \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": rpc error: code = NotFound desc = could not find container \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": container with ID starting with 3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738512 4822 scope.go:117] "RemoveContainer" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738766 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} err="failed to get container status \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": rpc error: code = NotFound desc = could not find container \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": container with ID starting with fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.738835 4822 scope.go:117] "RemoveContainer" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739007 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} err="failed to get container status \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": rpc error: code = NotFound desc = could not find container \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": container with ID starting with 592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739024 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739265 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} err="failed to get container status \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": rpc error: code = NotFound desc = could not find container \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": container with ID starting with 605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0 not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739281 4822 scope.go:117] "RemoveContainer" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739423 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} err="failed to get container status \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": rpc error: code = NotFound desc = could not find container \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": container with ID starting with 331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d not found: ID does not exist"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739440 4822 scope.go:117] "RemoveContainer" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"
Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739589 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} err="failed to get container status \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": rpc error: code = NotFound desc = could not find container \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": container with ID starting with 921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413 not found: ID does not
exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739605 4822 scope.go:117] "RemoveContainer" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739921 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} err="failed to get container status \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": rpc error: code = NotFound desc = could not find container \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": container with ID starting with 5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.739939 4822 scope.go:117] "RemoveContainer" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740104 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} err="failed to get container status \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": rpc error: code = NotFound desc = could not find container \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": container with ID starting with 24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740118 4822 scope.go:117] "RemoveContainer" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740313 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} err="failed to get container status 
\"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": rpc error: code = NotFound desc = could not find container \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": container with ID starting with f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740331 4822 scope.go:117] "RemoveContainer" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740474 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} err="failed to get container status \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": rpc error: code = NotFound desc = could not find container \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": container with ID starting with 8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740491 4822 scope.go:117] "RemoveContainer" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740651 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} err="failed to get container status \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": rpc error: code = NotFound desc = could not find container \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": container with ID starting with 3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740667 4822 scope.go:117] "RemoveContainer" 
containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740864 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} err="failed to get container status \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": rpc error: code = NotFound desc = could not find container \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": container with ID starting with fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.740887 4822 scope.go:117] "RemoveContainer" containerID="592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741050 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9"} err="failed to get container status \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": rpc error: code = NotFound desc = could not find container \"592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9\": container with ID starting with 592c8c524ba3c7a4b5d4168ec5eb417c6734634d9f572a0c219bbba5efcc87f9 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741066 4822 scope.go:117] "RemoveContainer" containerID="605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741221 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0"} err="failed to get container status \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": rpc error: code = NotFound desc = could 
not find container \"605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0\": container with ID starting with 605c3e2e1a5eace2763a66fd6280794cd8def520d3e2d9f39b4c9bb7ccf040c0 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741237 4822 scope.go:117] "RemoveContainer" containerID="331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741416 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d"} err="failed to get container status \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": rpc error: code = NotFound desc = could not find container \"331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d\": container with ID starting with 331fadc7d0796f622f3e4fd85b1821aa64b2a58f2fab566d415818432c04324d not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741433 4822 scope.go:117] "RemoveContainer" containerID="921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741654 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413"} err="failed to get container status \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": rpc error: code = NotFound desc = could not find container \"921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413\": container with ID starting with 921099f8750f8d8cdff39fdf5ac1e8fc5b595ba59b1c5556025fbdad48712413 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741674 4822 scope.go:117] "RemoveContainer" containerID="5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 
06:35:49.741918 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4"} err="failed to get container status \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": rpc error: code = NotFound desc = could not find container \"5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4\": container with ID starting with 5490acf2b29105c8658307b4e65b7c95210e8d8c47e7632fe9dee83c135b74b4 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.741935 4822 scope.go:117] "RemoveContainer" containerID="24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742107 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0"} err="failed to get container status \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": rpc error: code = NotFound desc = could not find container \"24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0\": container with ID starting with 24c0cc625950e89b9ad2adbae20710c6335b71e40b4ed924205fdd3d2b1edad0 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742123 4822 scope.go:117] "RemoveContainer" containerID="f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742354 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582"} err="failed to get container status \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": rpc error: code = NotFound desc = could not find container \"f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582\": container with ID starting with 
f6349fd376dcdeadc4652bd72a0a3d67d1cb99a4f131ffe8b4db315fa8371582 not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742369 4822 scope.go:117] "RemoveContainer" containerID="8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742547 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d"} err="failed to get container status \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": rpc error: code = NotFound desc = could not find container \"8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d\": container with ID starting with 8921289487adadd1c8af8566673d810ba108da33655dc1200ee01f35c516fe7d not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742562 4822 scope.go:117] "RemoveContainer" containerID="3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742794 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da"} err="failed to get container status \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": rpc error: code = NotFound desc = could not find container \"3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da\": container with ID starting with 3cfa5019d003593537e2a79b71d2a9a769c8bb7bf19ef0d5a679aa7bf8b3a5da not found: ID does not exist" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.742846 4822 scope.go:117] "RemoveContainer" containerID="fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67" Oct 10 06:35:49 crc kubenswrapper[4822]: I1010 06:35:49.743066 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67"} err="failed to get container status \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": rpc error: code = NotFound desc = could not find container \"fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67\": container with ID starting with fc6a0b0f2ef7dc33eb92a3b7602270a5774b1ac1a60531ed91250bef75c8ee67 not found: ID does not exist" Oct 10 06:35:50 crc kubenswrapper[4822]: I1010 06:35:50.480999 4822 generic.go:334] "Generic (PLEG): container finished" podID="241c97cd-d574-4c5f-96e1-59bb42981db2" containerID="2928fbd1d200833d4c2c43c059319191b1e1a0cc94e6f9dcbcef513906e71808" exitCode=0 Oct 10 06:35:50 crc kubenswrapper[4822]: I1010 06:35:50.481117 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerDied","Data":"2928fbd1d200833d4c2c43c059319191b1e1a0cc94e6f9dcbcef513906e71808"} Oct 10 06:35:50 crc kubenswrapper[4822]: I1010 06:35:50.481646 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"c6a3f5b0f93f4d0e5753874759fe5a42b36e60adc1aa398995ca3f1b639c304f"} Oct 10 06:35:50 crc kubenswrapper[4822]: I1010 06:35:50.483986 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/2.log" Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.491643 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"cd5e33dc61c3d70668ed8db5fde6b0bc1ebf6b70987c6eaaccd4e230ae1651c0"} Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.492068 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"48b1bac9fdad0bbe87529b26690b9775991291c58d4919130b1968c3719e024c"} Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.492080 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"236242ed9a6e29601e2811cc00bb67dbc5b1ac3b0e1eddb5b36c1085b320e4fa"} Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.492088 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"ee919fd377678a12cbbbba10681b6972957ed374524679bfbc4e7633d316ce86"} Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.492096 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"3a2afc97577353a509e2bfedf8652a827614eb5e4a8647266ac2ed939fd420a7"} Oct 10 06:35:51 crc kubenswrapper[4822]: I1010 06:35:51.492105 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"fa7fed754ecd2e770db2d5e06f75af708cec73f1985b6a7d6ad4f0310bbb3475"} Oct 10 06:35:54 crc kubenswrapper[4822]: I1010 06:35:54.513509 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"a6691bc25e3388bba126013bd0e1db7e7f686b77d8290f66596990bd8b8f3bdf"} Oct 10 06:35:55 crc kubenswrapper[4822]: I1010 06:35:55.892218 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bwbxr"] Oct 10 06:35:55 crc 
kubenswrapper[4822]: I1010 06:35:55.895680 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:55 crc kubenswrapper[4822]: I1010 06:35:55.900093 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 06:35:55 crc kubenswrapper[4822]: I1010 06:35:55.902041 4822 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cxqpg" Oct 10 06:35:55 crc kubenswrapper[4822]: I1010 06:35:55.902393 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 06:35:55 crc kubenswrapper[4822]: I1010 06:35:55.902570 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.092707 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.093213 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzb7b\" (UniqueName: \"kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.093293 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " 
pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.194560 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.194643 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzb7b\" (UniqueName: \"kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.194691 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.194836 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.195501 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.216541 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fzb7b\" (UniqueName: \"kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b\") pod \"crc-storage-crc-bwbxr\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") " pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.221323 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.261577 4822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(fcf8cfebe5f10e6f9380c23426edd3271f17b0e192bafa82496aa807fc430ba3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.261636 4822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(fcf8cfebe5f10e6f9380c23426edd3271f17b0e192bafa82496aa807fc430ba3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.261659 4822 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(fcf8cfebe5f10e6f9380c23426edd3271f17b0e192bafa82496aa807fc430ba3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.261701 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(fcf8cfebe5f10e6f9380c23426edd3271f17b0e192bafa82496aa807fc430ba3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bwbxr" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.525538 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" event={"ID":"241c97cd-d574-4c5f-96e1-59bb42981db2","Type":"ContainerStarted","Data":"47e4517cb9081c4a62278795b52a65b6bf68603d23d4234c59017744f28b1698"} Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.525870 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.525922 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.549934 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.568721 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z" podStartSLOduration=7.568684062 podStartE2EDuration="7.568684062s" podCreationTimestamp="2025-10-10 06:35:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:35:56.550730517 +0000 UTC m=+703.645888723" watchObservedRunningTime="2025-10-10 06:35:56.568684062 +0000 UTC m=+703.663842328" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.663967 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bwbxr"] Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.664105 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: I1010 06:35:56.664553 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.685124 4822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(0e1f2a0815c9b1cfd8d3ce1481096675cff0b431abc62ca1f806f4b54c604fc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.685183 4822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(0e1f2a0815c9b1cfd8d3ce1481096675cff0b431abc62ca1f806f4b54c604fc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.685202 4822 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(0e1f2a0815c9b1cfd8d3ce1481096675cff0b431abc62ca1f806f4b54c604fc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bwbxr" Oct 10 06:35:56 crc kubenswrapper[4822]: E1010 06:35:56.685248 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(0e1f2a0815c9b1cfd8d3ce1481096675cff0b431abc62ca1f806f4b54c604fc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bwbxr" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a"
Oct 10 06:35:57 crc kubenswrapper[4822]: I1010 06:35:57.531683 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:35:57 crc kubenswrapper[4822]: I1010 06:35:57.572003 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:36:00 crc kubenswrapper[4822]: I1010 06:36:00.650350 4822 scope.go:117] "RemoveContainer" containerID="e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad"
Oct 10 06:36:00 crc kubenswrapper[4822]: E1010 06:36:00.651059 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5x2kt_openshift-multus(ec9c77cf-dd02-4e39-b204-9f6540406973)\"" pod="openshift-multus/multus-5x2kt" podUID="ec9c77cf-dd02-4e39-b204-9f6540406973"
Oct 10 06:36:01 crc kubenswrapper[4822]: I1010 06:36:01.336641 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:36:01 crc kubenswrapper[4822]: I1010 06:36:01.336686 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 06:36:08 crc kubenswrapper[4822]: I1010 06:36:08.650351 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:08 crc kubenswrapper[4822]: I1010 06:36:08.651638 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:08 crc kubenswrapper[4822]: E1010 06:36:08.706141 4822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(b44fb5acd05bed427a6644e07df0bfc4cdf4b0025b685b84bb75217cdb450248): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 10 06:36:08 crc kubenswrapper[4822]: E1010 06:36:08.706214 4822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(b44fb5acd05bed427a6644e07df0bfc4cdf4b0025b685b84bb75217cdb450248): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:08 crc kubenswrapper[4822]: E1010 06:36:08.706239 4822 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(b44fb5acd05bed427a6644e07df0bfc4cdf4b0025b685b84bb75217cdb450248): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:08 crc kubenswrapper[4822]: E1010 06:36:08.706288 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bwbxr_crc-storage(8d7546a4-c45d-41a7-a7eb-24119b7c951a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bwbxr_crc-storage_8d7546a4-c45d-41a7-a7eb-24119b7c951a_0(b44fb5acd05bed427a6644e07df0bfc4cdf4b0025b685b84bb75217cdb450248): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bwbxr" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a"
Oct 10 06:36:14 crc kubenswrapper[4822]: I1010 06:36:14.649954 4822 scope.go:117] "RemoveContainer" containerID="e0cb5307c1b8e0c07caa219c2ed304d4abf8dfe9d92d267f95340e09899f88ad"
Oct 10 06:36:15 crc kubenswrapper[4822]: I1010 06:36:15.640439 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5x2kt_ec9c77cf-dd02-4e39-b204-9f6540406973/kube-multus/2.log"
Oct 10 06:36:15 crc kubenswrapper[4822]: I1010 06:36:15.640862 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5x2kt" event={"ID":"ec9c77cf-dd02-4e39-b204-9f6540406973","Type":"ContainerStarted","Data":"d6bf089a2bb821844ac7dba4becd5a3a71a79ccf66ae2a64baf7e94ed7f39772"}
Oct 10 06:36:19 crc kubenswrapper[4822]: I1010 06:36:19.597628 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtt5z"
Oct 10 06:36:24 crc kubenswrapper[4822]: I1010 06:36:24.649931 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:24 crc kubenswrapper[4822]: I1010 06:36:24.650873 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:24 crc kubenswrapper[4822]: I1010 06:36:24.911463 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bwbxr"]
Oct 10 06:36:24 crc kubenswrapper[4822]: I1010 06:36:24.920169 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 10 06:36:25 crc kubenswrapper[4822]: I1010 06:36:25.705384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwbxr" event={"ID":"8d7546a4-c45d-41a7-a7eb-24119b7c951a","Type":"ContainerStarted","Data":"fa03136c224f5fef851c1c11ebae801fcaa5ee0260b63370071a40b24d96be9f"}
Oct 10 06:36:26 crc kubenswrapper[4822]: I1010 06:36:26.714608 4822 generic.go:334] "Generic (PLEG): container finished" podID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" containerID="bd310d46d1233151146148c4bd637d08c61c1dc6f4a01fee315ce12118ba52c1" exitCode=0
Oct 10 06:36:26 crc kubenswrapper[4822]: I1010 06:36:26.714678 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwbxr" event={"ID":"8d7546a4-c45d-41a7-a7eb-24119b7c951a","Type":"ContainerDied","Data":"bd310d46d1233151146148c4bd637d08c61c1dc6f4a01fee315ce12118ba52c1"}
Oct 10 06:36:27 crc kubenswrapper[4822]: I1010 06:36:27.985707 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.106243 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzb7b\" (UniqueName: \"kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b\") pod \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") "
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.106326 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage\") pod \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") "
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.106370 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt\") pod \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\" (UID: \"8d7546a4-c45d-41a7-a7eb-24119b7c951a\") "
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.106568 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8d7546a4-c45d-41a7-a7eb-24119b7c951a" (UID: "8d7546a4-c45d-41a7-a7eb-24119b7c951a"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.106751 4822 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8d7546a4-c45d-41a7-a7eb-24119b7c951a-node-mnt\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.113415 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b" (OuterVolumeSpecName: "kube-api-access-fzb7b") pod "8d7546a4-c45d-41a7-a7eb-24119b7c951a" (UID: "8d7546a4-c45d-41a7-a7eb-24119b7c951a"). InnerVolumeSpecName "kube-api-access-fzb7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.128916 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8d7546a4-c45d-41a7-a7eb-24119b7c951a" (UID: "8d7546a4-c45d-41a7-a7eb-24119b7c951a"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.207937 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzb7b\" (UniqueName: \"kubernetes.io/projected/8d7546a4-c45d-41a7-a7eb-24119b7c951a-kube-api-access-fzb7b\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.207993 4822 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8d7546a4-c45d-41a7-a7eb-24119b7c951a-crc-storage\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.728548 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bwbxr" event={"ID":"8d7546a4-c45d-41a7-a7eb-24119b7c951a","Type":"ContainerDied","Data":"fa03136c224f5fef851c1c11ebae801fcaa5ee0260b63370071a40b24d96be9f"}
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.728581 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bwbxr"
Oct 10 06:36:28 crc kubenswrapper[4822]: I1010 06:36:28.728587 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa03136c224f5fef851c1c11ebae801fcaa5ee0260b63370071a40b24d96be9f"
Oct 10 06:36:31 crc kubenswrapper[4822]: I1010 06:36:31.337216 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:36:31 crc kubenswrapper[4822]: I1010 06:36:31.337704 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.278016 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"]
Oct 10 06:36:35 crc kubenswrapper[4822]: E1010 06:36:35.278735 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" containerName="storage"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.278747 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" containerName="storage"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.278854 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" containerName="storage"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.279845 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.282638 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.294821 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"]
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.396563 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-schtp\" (UniqueName: \"kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.396632 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.396687 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.498065 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-schtp\" (UniqueName: \"kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.498173 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.498288 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.498796 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.499160 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.528302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-schtp\" (UniqueName: \"kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.598432 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:35 crc kubenswrapper[4822]: I1010 06:36:35.833421 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"]
Oct 10 06:36:36 crc kubenswrapper[4822]: I1010 06:36:36.777766 4822 generic.go:334] "Generic (PLEG): container finished" podID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerID="34ae341a27ad585a865ebb87fb9ae1e49183f0049ef60c4889670f00b2b7008d" exitCode=0
Oct 10 06:36:36 crc kubenswrapper[4822]: I1010 06:36:36.777904 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw" event={"ID":"982470a6-c29e-4e2a-a83b-073df14ec4ff","Type":"ContainerDied","Data":"34ae341a27ad585a865ebb87fb9ae1e49183f0049ef60c4889670f00b2b7008d"}
Oct 10 06:36:36 crc kubenswrapper[4822]: I1010 06:36:36.778084 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw" event={"ID":"982470a6-c29e-4e2a-a83b-073df14ec4ff","Type":"ContainerStarted","Data":"e34eb45e1896aa3f8d614bb870be4d8d95c8d21b9d4ed1719c193a815e401629"}
Oct 10 06:36:38 crc kubenswrapper[4822]: I1010 06:36:38.794324 4822 generic.go:334] "Generic (PLEG): container finished" podID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerID="8125a3896e36db2aa337feee1c5a1fa3471840248364d961cc720000c7d72a94" exitCode=0
Oct 10 06:36:38 crc kubenswrapper[4822]: I1010 06:36:38.794390 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw" event={"ID":"982470a6-c29e-4e2a-a83b-073df14ec4ff","Type":"ContainerDied","Data":"8125a3896e36db2aa337feee1c5a1fa3471840248364d961cc720000c7d72a94"}
Oct 10 06:36:39 crc kubenswrapper[4822]: I1010 06:36:39.805007 4822 generic.go:334] "Generic (PLEG): container finished" podID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerID="ce30472adbf25df4765b9e113e3b927f7940b45d17e9ca2a770f3109037afae7" exitCode=0
Oct 10 06:36:39 crc kubenswrapper[4822]: I1010 06:36:39.805079 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw" event={"ID":"982470a6-c29e-4e2a-a83b-073df14ec4ff","Type":"ContainerDied","Data":"ce30472adbf25df4765b9e113e3b927f7940b45d17e9ca2a770f3109037afae7"}
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.051184 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.076903 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle\") pod \"982470a6-c29e-4e2a-a83b-073df14ec4ff\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") "
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.076957 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-schtp\" (UniqueName: \"kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp\") pod \"982470a6-c29e-4e2a-a83b-073df14ec4ff\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") "
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.077077 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util\") pod \"982470a6-c29e-4e2a-a83b-073df14ec4ff\" (UID: \"982470a6-c29e-4e2a-a83b-073df14ec4ff\") "
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.083775 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp" (OuterVolumeSpecName: "kube-api-access-schtp") pod "982470a6-c29e-4e2a-a83b-073df14ec4ff" (UID: "982470a6-c29e-4e2a-a83b-073df14ec4ff"). InnerVolumeSpecName "kube-api-access-schtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.085448 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle" (OuterVolumeSpecName: "bundle") pod "982470a6-c29e-4e2a-a83b-073df14ec4ff" (UID: "982470a6-c29e-4e2a-a83b-073df14ec4ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.090140 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util" (OuterVolumeSpecName: "util") pod "982470a6-c29e-4e2a-a83b-073df14ec4ff" (UID: "982470a6-c29e-4e2a-a83b-073df14ec4ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.178754 4822 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-util\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.178811 4822 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/982470a6-c29e-4e2a-a83b-073df14ec4ff-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.178826 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-schtp\" (UniqueName: \"kubernetes.io/projected/982470a6-c29e-4e2a-a83b-073df14ec4ff-kube-api-access-schtp\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.821757 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw" event={"ID":"982470a6-c29e-4e2a-a83b-073df14ec4ff","Type":"ContainerDied","Data":"e34eb45e1896aa3f8d614bb870be4d8d95c8d21b9d4ed1719c193a815e401629"}
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.821849 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34eb45e1896aa3f8d614bb870be4d8d95c8d21b9d4ed1719c193a815e401629"
Oct 10 06:36:41 crc kubenswrapper[4822]: I1010 06:36:41.821893 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.070847 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"]
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.071214 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" containerID="cri-o://c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef" gracePeriod=30
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.201315 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"]
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.201603 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" podUID="05179c47-551e-4445-bf6e-1f328d5f024c" containerName="route-controller-manager" containerID="cri-o://60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6" gracePeriod=30
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.314567 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"]
Oct 10 06:36:43 crc kubenswrapper[4822]: E1010 06:36:43.319042 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="extract"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.319071 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="extract"
Oct 10 06:36:43 crc kubenswrapper[4822]: E1010 06:36:43.319084 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="util"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.319089 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="util"
Oct 10 06:36:43 crc kubenswrapper[4822]: E1010 06:36:43.319100 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="pull"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.319106 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="pull"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.319203 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="982470a6-c29e-4e2a-a83b-073df14ec4ff" containerName="extract"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.319634 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.325309 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-k62z7"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.325323 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.325670 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.334102 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"]
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.409856 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82sj\" (UniqueName: \"kubernetes.io/projected/7b65d092-89df-47ee-81f0-c48ab056e714-kube-api-access-t82sj\") pod \"nmstate-operator-858ddd8f98-mmtq2\" (UID: \"7b65d092-89df-47ee-81f0-c48ab056e714\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.511678 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82sj\" (UniqueName: \"kubernetes.io/projected/7b65d092-89df-47ee-81f0-c48ab056e714-kube-api-access-t82sj\") pod \"nmstate-operator-858ddd8f98-mmtq2\" (UID: \"7b65d092-89df-47ee-81f0-c48ab056e714\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.535701 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82sj\" (UniqueName: \"kubernetes.io/projected/7b65d092-89df-47ee-81f0-c48ab056e714-kube-api-access-t82sj\") pod \"nmstate-operator-858ddd8f98-mmtq2\" (UID: \"7b65d092-89df-47ee-81f0-c48ab056e714\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.559889 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.612147 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.612204 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert\") pod \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.612248 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca\") pod \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.612309 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config\") pod \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.613223 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca" (OuterVolumeSpecName: "client-ca") pod "0c5524a5-19e5-425c-b94d-c6fd6c4fd916" (UID: "0c5524a5-19e5-425c-b94d-c6fd6c4fd916"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.613323 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config" (OuterVolumeSpecName: "config") pod "0c5524a5-19e5-425c-b94d-c6fd6c4fd916" (UID: "0c5524a5-19e5-425c-b94d-c6fd6c4fd916"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.613419 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmx68\" (UniqueName: \"kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68\") pod \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.613973 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles\") pod \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\" (UID: \"0c5524a5-19e5-425c-b94d-c6fd6c4fd916\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.614259 4822 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-client-ca\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.614274 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.614997 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0c5524a5-19e5-425c-b94d-c6fd6c4fd916" (UID: "0c5524a5-19e5-425c-b94d-c6fd6c4fd916"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.617122 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c5524a5-19e5-425c-b94d-c6fd6c4fd916" (UID: "0c5524a5-19e5-425c-b94d-c6fd6c4fd916"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.617481 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68" (OuterVolumeSpecName: "kube-api-access-bmx68") pod "0c5524a5-19e5-425c-b94d-c6fd6c4fd916" (UID: "0c5524a5-19e5-425c-b94d-c6fd6c4fd916"). InnerVolumeSpecName "kube-api-access-bmx68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.653356 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.714683 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rclk4\" (UniqueName: \"kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4\") pod \"05179c47-551e-4445-bf6e-1f328d5f024c\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.715337 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config\") pod \"05179c47-551e-4445-bf6e-1f328d5f024c\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.715534 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert\") pod \"05179c47-551e-4445-bf6e-1f328d5f024c\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.715688 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca\") pod \"05179c47-551e-4445-bf6e-1f328d5f024c\" (UID: \"05179c47-551e-4445-bf6e-1f328d5f024c\") "
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.716056 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.716156 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmx68\" (UniqueName: \"kubernetes.io/projected/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-kube-api-access-bmx68\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.716232 4822 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0c5524a5-19e5-425c-b94d-c6fd6c4fd916-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.716927 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config" (OuterVolumeSpecName: "config") pod "05179c47-551e-4445-bf6e-1f328d5f024c" (UID: "05179c47-551e-4445-bf6e-1f328d5f024c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.717535 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca" (OuterVolumeSpecName: "client-ca") pod "05179c47-551e-4445-bf6e-1f328d5f024c" (UID: "05179c47-551e-4445-bf6e-1f328d5f024c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.719220 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4" (OuterVolumeSpecName: "kube-api-access-rclk4") pod "05179c47-551e-4445-bf6e-1f328d5f024c" (UID: "05179c47-551e-4445-bf6e-1f328d5f024c"). InnerVolumeSpecName "kube-api-access-rclk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.719430 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05179c47-551e-4445-bf6e-1f328d5f024c" (UID: "05179c47-551e-4445-bf6e-1f328d5f024c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.817568 4822 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-client-ca\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.817593 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rclk4\" (UniqueName: \"kubernetes.io/projected/05179c47-551e-4445-bf6e-1f328d5f024c-kube-api-access-rclk4\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.817603 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05179c47-551e-4445-bf6e-1f328d5f024c-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.817611 4822 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05179c47-551e-4445-bf6e-1f328d5f024c-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.832240 4822 generic.go:334] "Generic (PLEG): container finished" podID="05179c47-551e-4445-bf6e-1f328d5f024c" containerID="60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6" exitCode=0
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.832305 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" event={"ID":"05179c47-551e-4445-bf6e-1f328d5f024c","Type":"ContainerDied","Data":"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6"}
Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.832335 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"
event={"ID":"05179c47-551e-4445-bf6e-1f328d5f024c","Type":"ContainerDied","Data":"9d93d429b3e8640fa93ad12b976813361a50485aa0224f36ddcf1642fe2cc85d"} Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.832356 4822 scope.go:117] "RemoveContainer" containerID="60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.832459 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.841134 4822 generic.go:334] "Generic (PLEG): container finished" podID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerID="c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef" exitCode=0 Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.841176 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" event={"ID":"0c5524a5-19e5-425c-b94d-c6fd6c4fd916","Type":"ContainerDied","Data":"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef"} Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.841202 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" event={"ID":"0c5524a5-19e5-425c-b94d-c6fd6c4fd916","Type":"ContainerDied","Data":"eea2134a529864bb8dec11c0c771ef6bee89bd3084bfced3ec8b854510a7a735"} Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.841256 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9kf5" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.864550 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"] Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.864938 4822 scope.go:117] "RemoveContainer" containerID="60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6" Oct 10 06:36:43 crc kubenswrapper[4822]: E1010 06:36:43.868229 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6\": container with ID starting with 60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6 not found: ID does not exist" containerID="60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.868382 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6"} err="failed to get container status \"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6\": rpc error: code = NotFound desc = could not find container \"60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6\": container with ID starting with 60d8157a58961149b428f4b3ad64d62d59b63a77b4af1efbd8d001e5ac3740b6 not found: ID does not exist" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.868508 4822 scope.go:117] "RemoveContainer" containerID="c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.869409 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9kf5"] Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.872919 4822 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2"] Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.883070 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"] Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.885828 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wfs5s"] Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.885890 4822 scope.go:117] "RemoveContainer" containerID="c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef" Oct 10 06:36:43 crc kubenswrapper[4822]: E1010 06:36:43.886338 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef\": container with ID starting with c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef not found: ID does not exist" containerID="c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef" Oct 10 06:36:43 crc kubenswrapper[4822]: I1010 06:36:43.886369 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef"} err="failed to get container status \"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef\": rpc error: code = NotFound desc = could not find container \"c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef\": container with ID starting with c277349d0e2a6b759b31f53aa1f4913e531e87d828cf1b7138101ebc201a24ef not found: ID does not exist" Oct 10 06:36:43 crc kubenswrapper[4822]: W1010 06:36:43.888997 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b65d092_89df_47ee_81f0_c48ab056e714.slice/crio-e7bc7cc57f3d30d50447d74bc9a35353f51eb309434f91639fbdd0a3e5af05b2 WatchSource:0}: Error finding container e7bc7cc57f3d30d50447d74bc9a35353f51eb309434f91639fbdd0a3e5af05b2: Status 404 returned error can't find the container with id e7bc7cc57f3d30d50447d74bc9a35353f51eb309434f91639fbdd0a3e5af05b2 Oct 10 06:36:44 crc kubenswrapper[4822]: I1010 06:36:44.850545 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2" event={"ID":"7b65d092-89df-47ee-81f0-c48ab056e714","Type":"ContainerStarted","Data":"e7bc7cc57f3d30d50447d74bc9a35353f51eb309434f91639fbdd0a3e5af05b2"} Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.209362 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg"] Oct 10 06:36:45 crc kubenswrapper[4822]: E1010 06:36:45.209686 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.209711 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" Oct 10 06:36:45 crc kubenswrapper[4822]: E1010 06:36:45.209732 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05179c47-551e-4445-bf6e-1f328d5f024c" containerName="route-controller-manager" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.209744 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="05179c47-551e-4445-bf6e-1f328d5f024c" containerName="route-controller-manager" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.209925 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" containerName="controller-manager" Oct 10 06:36:45 crc 
kubenswrapper[4822]: I1010 06:36:45.209945 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="05179c47-551e-4445-bf6e-1f328d5f024c" containerName="route-controller-manager" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.210507 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.212849 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59869d5b9-ml8zv"] Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.213587 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.213638 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.216398 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.216456 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.216543 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.216566 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.216713 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 
06:36:45.221113 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.221194 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.221423 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.221877 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.225533 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg"] Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.227761 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59869d5b9-ml8zv"] Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.229388 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.229388 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.234821 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247027 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4r6q\" (UniqueName: \"kubernetes.io/projected/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-kube-api-access-k4r6q\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: 
\"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247105 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-serving-cert\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247137 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-client-ca\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247228 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-client-ca\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247289 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpwsg\" (UniqueName: \"kubernetes.io/projected/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-kube-api-access-dpwsg\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247351 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-config\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247386 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-serving-cert\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247434 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-config\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.247461 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-proxy-ca-bundles\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.348462 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4r6q\" (UniqueName: \"kubernetes.io/projected/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-kube-api-access-k4r6q\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: 
\"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.348937 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-client-ca\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.348954 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-serving-cert\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350072 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-client-ca\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350231 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-client-ca\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350345 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpwsg\" (UniqueName: 
\"kubernetes.io/projected/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-kube-api-access-dpwsg\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350440 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-serving-cert\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350480 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-config\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350523 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-config\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.350555 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-proxy-ca-bundles\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: 
I1010 06:36:45.351035 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-client-ca\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.352385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-config\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.352406 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-config\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.353232 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-proxy-ca-bundles\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.357157 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-serving-cert\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " 
pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.358024 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-serving-cert\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.366028 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4r6q\" (UniqueName: \"kubernetes.io/projected/d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6-kube-api-access-k4r6q\") pod \"controller-manager-59869d5b9-ml8zv\" (UID: \"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6\") " pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.367106 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpwsg\" (UniqueName: \"kubernetes.io/projected/9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72-kube-api-access-dpwsg\") pod \"route-controller-manager-84dfbc9688-sw6cg\" (UID: \"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72\") " pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.537855 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.551145 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.683972 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05179c47-551e-4445-bf6e-1f328d5f024c" path="/var/lib/kubelet/pods/05179c47-551e-4445-bf6e-1f328d5f024c/volumes" Oct 10 06:36:45 crc kubenswrapper[4822]: I1010 06:36:45.684553 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c5524a5-19e5-425c-b94d-c6fd6c4fd916" path="/var/lib/kubelet/pods/0c5524a5-19e5-425c-b94d-c6fd6c4fd916/volumes" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.065592 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59869d5b9-ml8zv"] Oct 10 06:36:46 crc kubenswrapper[4822]: W1010 06:36:46.079742 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b5fa11_96fb_4bbb_a7eb_0cbabc68d4a6.slice/crio-fbabe88b8b4e77468d32ae2a9b30d98e2fb1aeb0b5aeb2eac00c4bb5186ae17f WatchSource:0}: Error finding container fbabe88b8b4e77468d32ae2a9b30d98e2fb1aeb0b5aeb2eac00c4bb5186ae17f: Status 404 returned error can't find the container with id fbabe88b8b4e77468d32ae2a9b30d98e2fb1aeb0b5aeb2eac00c4bb5186ae17f Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.245652 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg"] Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.874880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" event={"ID":"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72","Type":"ContainerStarted","Data":"05b1652925cbbfa118036f80a437ca8864f7104a00123b1bd1903f3565c7f459"} Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.875237 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" event={"ID":"9e1a5fe0-7b29-49c6-921b-b4d7d4ef7b72","Type":"ContainerStarted","Data":"ac68b3263b0236fb5c550d8028690179e4bb5e635e965fff011d161b3656b115"} Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.875255 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.876590 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2" event={"ID":"7b65d092-89df-47ee-81f0-c48ab056e714","Type":"ContainerStarted","Data":"0e5c08931a910c6dc5204506b047aaccf38e15bcd9dc076b03465d2044a392e6"} Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.877833 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" event={"ID":"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6","Type":"ContainerStarted","Data":"c7be9d8408000ab80d720831b1866c1b0fa6f5bd7422ad4fd6382b34439dac8a"} Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.877887 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" event={"ID":"d2b5fa11-96fb-4bbb-a7eb-0cbabc68d4a6","Type":"ContainerStarted","Data":"fbabe88b8b4e77468d32ae2a9b30d98e2fb1aeb0b5aeb2eac00c4bb5186ae17f"} Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.878073 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.883013 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.883221 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.893321 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84dfbc9688-sw6cg" podStartSLOduration=3.893303151 podStartE2EDuration="3.893303151s" podCreationTimestamp="2025-10-10 06:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:36:46.892328061 +0000 UTC m=+753.987486267" watchObservedRunningTime="2025-10-10 06:36:46.893303151 +0000 UTC m=+753.988461357" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.937991 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59869d5b9-ml8zv" podStartSLOduration=3.937972761 podStartE2EDuration="3.937972761s" podCreationTimestamp="2025-10-10 06:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:36:46.936822616 +0000 UTC m=+754.031980842" watchObservedRunningTime="2025-10-10 06:36:46.937972761 +0000 UTC m=+754.033130967" Oct 10 06:36:46 crc kubenswrapper[4822]: I1010 06:36:46.956512 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mmtq2" podStartSLOduration=1.905268728 podStartE2EDuration="3.956491745s" podCreationTimestamp="2025-10-10 06:36:43 +0000 UTC" firstStartedPulling="2025-10-10 06:36:43.891262465 +0000 UTC m=+750.986420661" lastFinishedPulling="2025-10-10 06:36:45.942485482 +0000 UTC m=+753.037643678" observedRunningTime="2025-10-10 06:36:46.952239306 +0000 UTC m=+754.047397542" watchObservedRunningTime="2025-10-10 06:36:46.956491745 +0000 UTC m=+754.051649941" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.865881 
4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn"] Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.866745 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.874269 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tbghs" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.882901 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn"] Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.886650 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h"] Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.887588 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.890406 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.907991 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lkp99"] Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.908744 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.909331 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h"] Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989444 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-nmstate-lock\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989509 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-dbus-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989534 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-ovs-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989603 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj42k\" (UniqueName: \"kubernetes.io/projected/6482591b-0943-4d7c-90ac-054449e582ba-kube-api-access-bj42k\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989684 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-flx4t\" (UniqueName: \"kubernetes.io/projected/e6165414-b0cd-4f52-a9bc-da894b2cf483-kube-api-access-flx4t\") pod \"nmstate-metrics-fdff9cb8d-6xzcn\" (UID: \"e6165414-b0cd-4f52-a9bc-da894b2cf483\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989739 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kzq\" (UniqueName: \"kubernetes.io/projected/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-kube-api-access-z2kzq\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:47 crc kubenswrapper[4822]: I1010 06:36:47.989775 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.030752 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc"] Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.031681 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.045056 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.045310 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hphhk" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.045427 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.048851 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc"] Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090502 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/044d3970-563c-4037-8388-21e91330f82c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090580 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-dbus-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090603 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-nmstate-lock\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 
06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090624 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-ovs-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090651 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj42k\" (UniqueName: \"kubernetes.io/projected/6482591b-0943-4d7c-90ac-054449e582ba-kube-api-access-bj42k\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090704 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flx4t\" (UniqueName: \"kubernetes.io/projected/e6165414-b0cd-4f52-a9bc-da894b2cf483-kube-api-access-flx4t\") pod \"nmstate-metrics-fdff9cb8d-6xzcn\" (UID: \"e6165414-b0cd-4f52-a9bc-da894b2cf483\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-nmstate-lock\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090729 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044d3970-563c-4037-8388-21e91330f82c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 
crc kubenswrapper[4822]: I1010 06:36:48.090790 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-ovs-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090794 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kzq\" (UniqueName: \"kubernetes.io/projected/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-kube-api-access-z2kzq\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090859 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.090897 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvmb\" (UniqueName: \"kubernetes.io/projected/044d3970-563c-4037-8388-21e91330f82c-kube-api-access-vfvmb\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.091372 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6482591b-0943-4d7c-90ac-054449e582ba-dbus-socket\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 
06:36:48 crc kubenswrapper[4822]: E1010 06:36:48.091414 4822 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 10 06:36:48 crc kubenswrapper[4822]: E1010 06:36:48.091466 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair podName:e5c07223-93dd-414b-b8c2-c177e0e8c4e9 nodeName:}" failed. No retries permitted until 2025-10-10 06:36:48.591447804 +0000 UTC m=+755.686606000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair") pod "nmstate-webhook-6cdbc54649-z4s2h" (UID: "e5c07223-93dd-414b-b8c2-c177e0e8c4e9") : secret "openshift-nmstate-webhook" not found Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.116775 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flx4t\" (UniqueName: \"kubernetes.io/projected/e6165414-b0cd-4f52-a9bc-da894b2cf483-kube-api-access-flx4t\") pod \"nmstate-metrics-fdff9cb8d-6xzcn\" (UID: \"e6165414-b0cd-4f52-a9bc-da894b2cf483\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.117682 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj42k\" (UniqueName: \"kubernetes.io/projected/6482591b-0943-4d7c-90ac-054449e582ba-kube-api-access-bj42k\") pod \"nmstate-handler-lkp99\" (UID: \"6482591b-0943-4d7c-90ac-054449e582ba\") " pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.124776 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kzq\" (UniqueName: \"kubernetes.io/projected/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-kube-api-access-z2kzq\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.185876 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.191855 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvmb\" (UniqueName: \"kubernetes.io/projected/044d3970-563c-4037-8388-21e91330f82c-kube-api-access-vfvmb\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.191925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/044d3970-563c-4037-8388-21e91330f82c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.192064 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044d3970-563c-4037-8388-21e91330f82c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.193312 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044d3970-563c-4037-8388-21e91330f82c-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.195394 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/044d3970-563c-4037-8388-21e91330f82c-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.224823 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.234868 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvmb\" (UniqueName: \"kubernetes.io/projected/044d3970-563c-4037-8388-21e91330f82c-kube-api-access-vfvmb\") pod \"nmstate-console-plugin-6b874cbd85-4w4wc\" (UID: \"044d3970-563c-4037-8388-21e91330f82c\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: W1010 06:36:48.262992 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6482591b_0943_4d7c_90ac_054449e582ba.slice/crio-54393db89eb4594e5f6f64d2091a55d72d54e04f2eb7467464dd9839bf87d735 WatchSource:0}: Error finding container 54393db89eb4594e5f6f64d2091a55d72d54e04f2eb7467464dd9839bf87d735: Status 404 returned error can't find the container with id 54393db89eb4594e5f6f64d2091a55d72d54e04f2eb7467464dd9839bf87d735 Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.329236 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-786f49778b-j88j6"] Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.330380 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.350597 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-786f49778b-j88j6"] Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.358079 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395010 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-serving-cert\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395072 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbgt8\" (UniqueName: \"kubernetes.io/projected/772cc447-efda-4c38-b1bc-a666c1657a81-kube-api-access-hbgt8\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395131 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-trusted-ca-bundle\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395165 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-oauth-serving-cert\") pod 
\"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395198 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-console-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-oauth-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.395364 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-service-ca\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.502710 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-serving-cert\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503638 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbgt8\" (UniqueName: 
\"kubernetes.io/projected/772cc447-efda-4c38-b1bc-a666c1657a81-kube-api-access-hbgt8\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503705 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-trusted-ca-bundle\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503750 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-oauth-serving-cert\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503792 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-console-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503838 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-oauth-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.503947 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-service-ca\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.504722 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-trusted-ca-bundle\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.504900 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-service-ca\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.504934 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-console-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.504930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/772cc447-efda-4c38-b1bc-a666c1657a81-oauth-serving-cert\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.509720 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-serving-cert\") 
pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.513303 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/772cc447-efda-4c38-b1bc-a666c1657a81-console-oauth-config\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.521178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbgt8\" (UniqueName: \"kubernetes.io/projected/772cc447-efda-4c38-b1bc-a666c1657a81-kube-api-access-hbgt8\") pod \"console-786f49778b-j88j6\" (UID: \"772cc447-efda-4c38-b1bc-a666c1657a81\") " pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.605660 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.609129 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e5c07223-93dd-414b-b8c2-c177e0e8c4e9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-z4s2h\" (UID: \"e5c07223-93dd-414b-b8c2-c177e0e8c4e9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.621720 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc"] Oct 10 06:36:48 crc kubenswrapper[4822]: W1010 06:36:48.625561 4822 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod044d3970_563c_4037_8388_21e91330f82c.slice/crio-a8cf94d8a520c59fd4b6bd4466df2cfb7ce93328a7d94cfc10acd26bb9f7609f WatchSource:0}: Error finding container a8cf94d8a520c59fd4b6bd4466df2cfb7ce93328a7d94cfc10acd26bb9f7609f: Status 404 returned error can't find the container with id a8cf94d8a520c59fd4b6bd4466df2cfb7ce93328a7d94cfc10acd26bb9f7609f Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.644191 4822 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.664049 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.704753 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn"] Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.804078 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.904078 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" event={"ID":"044d3970-563c-4037-8388-21e91330f82c","Type":"ContainerStarted","Data":"a8cf94d8a520c59fd4b6bd4466df2cfb7ce93328a7d94cfc10acd26bb9f7609f"} Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.905255 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lkp99" event={"ID":"6482591b-0943-4d7c-90ac-054449e582ba","Type":"ContainerStarted","Data":"54393db89eb4594e5f6f64d2091a55d72d54e04f2eb7467464dd9839bf87d735"} Oct 10 06:36:48 crc kubenswrapper[4822]: I1010 06:36:48.907170 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" event={"ID":"e6165414-b0cd-4f52-a9bc-da894b2cf483","Type":"ContainerStarted","Data":"e9dc835e0a495e84594b9e3d40020b60c253731ee99905492d47924188db21ff"} Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.135496 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-786f49778b-j88j6"] Oct 10 06:36:49 crc kubenswrapper[4822]: W1010 06:36:49.139162 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772cc447_efda_4c38_b1bc_a666c1657a81.slice/crio-3d2902fec89dcf0eed0dd85856c6ced2caf2dfd6439e852513bfb209914b9dcb WatchSource:0}: Error finding container 3d2902fec89dcf0eed0dd85856c6ced2caf2dfd6439e852513bfb209914b9dcb: Status 404 returned error can't find the container with id 3d2902fec89dcf0eed0dd85856c6ced2caf2dfd6439e852513bfb209914b9dcb Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.243466 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h"] Oct 10 06:36:49 crc kubenswrapper[4822]: W1010 
06:36:49.250176 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c07223_93dd_414b_b8c2_c177e0e8c4e9.slice/crio-ac93d8f9a27c7c3a2ab92ccaa8d234222b8098a6d17f28b27884bdd11162abba WatchSource:0}: Error finding container ac93d8f9a27c7c3a2ab92ccaa8d234222b8098a6d17f28b27884bdd11162abba: Status 404 returned error can't find the container with id ac93d8f9a27c7c3a2ab92ccaa8d234222b8098a6d17f28b27884bdd11162abba Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.916126 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786f49778b-j88j6" event={"ID":"772cc447-efda-4c38-b1bc-a666c1657a81","Type":"ContainerStarted","Data":"4ed8dcb2624598e164149562afaf4ce98721f8df41109667b0e0fe098529a125"} Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.916167 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-786f49778b-j88j6" event={"ID":"772cc447-efda-4c38-b1bc-a666c1657a81","Type":"ContainerStarted","Data":"3d2902fec89dcf0eed0dd85856c6ced2caf2dfd6439e852513bfb209914b9dcb"} Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.917330 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" event={"ID":"e5c07223-93dd-414b-b8c2-c177e0e8c4e9","Type":"ContainerStarted","Data":"ac93d8f9a27c7c3a2ab92ccaa8d234222b8098a6d17f28b27884bdd11162abba"} Oct 10 06:36:49 crc kubenswrapper[4822]: I1010 06:36:49.934461 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-786f49778b-j88j6" podStartSLOduration=1.9344427560000002 podStartE2EDuration="1.934442756s" podCreationTimestamp="2025-10-10 06:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:36:49.934289202 +0000 UTC m=+757.029447408" watchObservedRunningTime="2025-10-10 
06:36:49.934442756 +0000 UTC m=+757.029600952" Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.929104 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" event={"ID":"e6165414-b0cd-4f52-a9bc-da894b2cf483","Type":"ContainerStarted","Data":"2c515ade6c0811a0fe2141964f05fdc43b2e04b898cc1195d126dba49eb5dd3c"} Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.931755 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" event={"ID":"e5c07223-93dd-414b-b8c2-c177e0e8c4e9","Type":"ContainerStarted","Data":"775a0f4c6d992fe1836d2362eb593b804cccd2fe8ea1513ebf52fb19f96fccc5"} Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.932600 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.935383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" event={"ID":"044d3970-563c-4037-8388-21e91330f82c","Type":"ContainerStarted","Data":"ffc82e8a4893a6836b1aac2c808bb0eb1e1bdeee6241512d50d27f0ab77418bf"} Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.937361 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lkp99" event={"ID":"6482591b-0943-4d7c-90ac-054449e582ba","Type":"ContainerStarted","Data":"5e26b39ee01aeaa6a0bab0a59b5c6e1bdcd3fe67b49269022426a8f508988b19"} Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.937778 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:51 crc kubenswrapper[4822]: I1010 06:36:51.985963 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" podStartSLOduration=2.797908369 podStartE2EDuration="4.985943581s" 
podCreationTimestamp="2025-10-10 06:36:47 +0000 UTC" firstStartedPulling="2025-10-10 06:36:49.252866718 +0000 UTC m=+756.348024924" lastFinishedPulling="2025-10-10 06:36:51.44090193 +0000 UTC m=+758.536060136" observedRunningTime="2025-10-10 06:36:51.961280689 +0000 UTC m=+759.056438885" watchObservedRunningTime="2025-10-10 06:36:51.985943581 +0000 UTC m=+759.081101787" Oct 10 06:36:52 crc kubenswrapper[4822]: I1010 06:36:52.007054 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4w4wc" podStartSLOduration=1.194290314 podStartE2EDuration="4.007031463s" podCreationTimestamp="2025-10-10 06:36:48 +0000 UTC" firstStartedPulling="2025-10-10 06:36:48.627913993 +0000 UTC m=+755.723072189" lastFinishedPulling="2025-10-10 06:36:51.440655142 +0000 UTC m=+758.535813338" observedRunningTime="2025-10-10 06:36:51.984968261 +0000 UTC m=+759.080126477" watchObservedRunningTime="2025-10-10 06:36:52.007031463 +0000 UTC m=+759.102189659" Oct 10 06:36:53 crc kubenswrapper[4822]: I1010 06:36:53.669702 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lkp99" podStartSLOduration=3.499837957 podStartE2EDuration="6.669684293s" podCreationTimestamp="2025-10-10 06:36:47 +0000 UTC" firstStartedPulling="2025-10-10 06:36:48.265306389 +0000 UTC m=+755.360464585" lastFinishedPulling="2025-10-10 06:36:51.435152725 +0000 UTC m=+758.530310921" observedRunningTime="2025-10-10 06:36:52.010457947 +0000 UTC m=+759.105616153" watchObservedRunningTime="2025-10-10 06:36:53.669684293 +0000 UTC m=+760.764842489" Oct 10 06:36:53 crc kubenswrapper[4822]: I1010 06:36:53.968273 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" event={"ID":"e6165414-b0cd-4f52-a9bc-da894b2cf483","Type":"ContainerStarted","Data":"9c1a1056b7737b2346111c826a681c3fabc687fde85974a63e83d9ae4ebe0987"} Oct 10 06:36:54 crc 
kubenswrapper[4822]: I1010 06:36:54.000614 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6xzcn" podStartSLOduration=2.125667654 podStartE2EDuration="7.00055995s" podCreationTimestamp="2025-10-10 06:36:47 +0000 UTC" firstStartedPulling="2025-10-10 06:36:48.736011136 +0000 UTC m=+755.831169332" lastFinishedPulling="2025-10-10 06:36:53.610903432 +0000 UTC m=+760.706061628" observedRunningTime="2025-10-10 06:36:53.999214629 +0000 UTC m=+761.094372855" watchObservedRunningTime="2025-10-10 06:36:54.00055995 +0000 UTC m=+761.095718146" Oct 10 06:36:58 crc kubenswrapper[4822]: I1010 06:36:58.283404 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lkp99" Oct 10 06:36:58 crc kubenswrapper[4822]: I1010 06:36:58.665195 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:58 crc kubenswrapper[4822]: I1010 06:36:58.665307 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:58 crc kubenswrapper[4822]: I1010 06:36:58.674410 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.008081 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-786f49778b-j88j6" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.109147 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.317978 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.319225 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.331228 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.482995 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.483206 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5pz\" (UniqueName: \"kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.483272 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.584985 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5pz\" (UniqueName: \"kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.585029 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.585094 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.585545 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.585631 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.606571 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5pz\" (UniqueName: \"kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz\") pod \"redhat-operators-h4phn\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:36:59 crc kubenswrapper[4822]: I1010 06:36:59.635554 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:00 crc kubenswrapper[4822]: I1010 06:37:00.042275 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:37:00 crc kubenswrapper[4822]: W1010 06:37:00.055985 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c0a976a_e2a7_4fd9_87a3_992190236095.slice/crio-550c5b27def9d1b345050feb7a6a34ceb5c88d653116d98344c4f4ff35eece33 WatchSource:0}: Error finding container 550c5b27def9d1b345050feb7a6a34ceb5c88d653116d98344c4f4ff35eece33: Status 404 returned error can't find the container with id 550c5b27def9d1b345050feb7a6a34ceb5c88d653116d98344c4f4ff35eece33 Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.018396 4822 generic.go:334] "Generic (PLEG): container finished" podID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerID="22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72" exitCode=0 Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.018463 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerDied","Data":"22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72"} Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.018504 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerStarted","Data":"550c5b27def9d1b345050feb7a6a34ceb5c88d653116d98344c4f4ff35eece33"} Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.337343 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.338071 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.338298 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.339257 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:37:01 crc kubenswrapper[4822]: I1010 06:37:01.339475 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60" gracePeriod=600 Oct 10 06:37:02 crc kubenswrapper[4822]: I1010 06:37:02.030272 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60" exitCode=0 Oct 10 06:37:02 crc kubenswrapper[4822]: I1010 06:37:02.030337 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60"} Oct 10 06:37:02 crc kubenswrapper[4822]: I1010 06:37:02.030466 4822 scope.go:117] "RemoveContainer" containerID="e955f367f73ed0e164cc92a00b251ea7a107033da0a713ece34f1828bba18bd1" Oct 10 06:37:03 crc kubenswrapper[4822]: I1010 06:37:03.038405 4822 generic.go:334] "Generic (PLEG): container finished" podID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerID="476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881" exitCode=0 Oct 10 06:37:03 crc kubenswrapper[4822]: I1010 06:37:03.038534 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerDied","Data":"476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881"} Oct 10 06:37:03 crc kubenswrapper[4822]: I1010 06:37:03.044062 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122"} Oct 10 06:37:04 crc kubenswrapper[4822]: I1010 06:37:04.053090 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerStarted","Data":"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b"} Oct 10 06:37:04 crc kubenswrapper[4822]: I1010 06:37:04.079468 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4phn" podStartSLOduration=2.621222394 podStartE2EDuration="5.079445946s" podCreationTimestamp="2025-10-10 06:36:59 +0000 UTC" firstStartedPulling="2025-10-10 06:37:01.02220026 +0000 UTC m=+768.117358466" lastFinishedPulling="2025-10-10 06:37:03.480423802 
+0000 UTC m=+770.575582018" observedRunningTime="2025-10-10 06:37:04.075933289 +0000 UTC m=+771.171091505" watchObservedRunningTime="2025-10-10 06:37:04.079445946 +0000 UTC m=+771.174604142" Oct 10 06:37:06 crc kubenswrapper[4822]: I1010 06:37:06.939684 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:06 crc kubenswrapper[4822]: I1010 06:37:06.942385 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.016709 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.090172 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.090230 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.090265 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qbh\" (UniqueName: \"kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc 
kubenswrapper[4822]: I1010 06:37:07.191488 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.191549 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.191583 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qbh\" (UniqueName: \"kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.192133 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.192156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.217190 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qbh\" (UniqueName: \"kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh\") pod \"redhat-marketplace-dr568\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.275316 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:07 crc kubenswrapper[4822]: I1010 06:37:07.685301 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:07 crc kubenswrapper[4822]: W1010 06:37:07.693916 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b81a6f_7626_4d1e_8781_acace6cc4da9.slice/crio-e8f305f25f3b1dcbd384c796ef23b17de5d7601b8ba046b9270142ee1af9f760 WatchSource:0}: Error finding container e8f305f25f3b1dcbd384c796ef23b17de5d7601b8ba046b9270142ee1af9f760: Status 404 returned error can't find the container with id e8f305f25f3b1dcbd384c796ef23b17de5d7601b8ba046b9270142ee1af9f760 Oct 10 06:37:08 crc kubenswrapper[4822]: I1010 06:37:08.093632 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerID="b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44" exitCode=0 Oct 10 06:37:08 crc kubenswrapper[4822]: I1010 06:37:08.093762 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerDied","Data":"b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44"} Oct 10 06:37:08 crc kubenswrapper[4822]: I1010 06:37:08.094001 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" 
event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerStarted","Data":"e8f305f25f3b1dcbd384c796ef23b17de5d7601b8ba046b9270142ee1af9f760"} Oct 10 06:37:08 crc kubenswrapper[4822]: I1010 06:37:08.809065 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-z4s2h" Oct 10 06:37:09 crc kubenswrapper[4822]: I1010 06:37:09.101565 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerID="d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5" exitCode=0 Oct 10 06:37:09 crc kubenswrapper[4822]: I1010 06:37:09.101602 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerDied","Data":"d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5"} Oct 10 06:37:09 crc kubenswrapper[4822]: I1010 06:37:09.636520 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:09 crc kubenswrapper[4822]: I1010 06:37:09.636925 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:09 crc kubenswrapper[4822]: I1010 06:37:09.683890 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:10 crc kubenswrapper[4822]: I1010 06:37:10.108957 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerStarted","Data":"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8"} Oct 10 06:37:10 crc kubenswrapper[4822]: I1010 06:37:10.129028 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dr568" 
podStartSLOduration=2.5710253180000002 podStartE2EDuration="4.129012261s" podCreationTimestamp="2025-10-10 06:37:06 +0000 UTC" firstStartedPulling="2025-10-10 06:37:08.095632449 +0000 UTC m=+775.190790655" lastFinishedPulling="2025-10-10 06:37:09.653619402 +0000 UTC m=+776.748777598" observedRunningTime="2025-10-10 06:37:10.127688431 +0000 UTC m=+777.222846647" watchObservedRunningTime="2025-10-10 06:37:10.129012261 +0000 UTC m=+777.224170447" Oct 10 06:37:10 crc kubenswrapper[4822]: I1010 06:37:10.154414 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.097465 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.121604 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4phn" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="registry-server" containerID="cri-o://33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b" gracePeriod=2 Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.594897 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.760193 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j5pz\" (UniqueName: \"kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz\") pod \"2c0a976a-e2a7-4fd9-87a3-992190236095\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.760267 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content\") pod \"2c0a976a-e2a7-4fd9-87a3-992190236095\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.760353 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities\") pod \"2c0a976a-e2a7-4fd9-87a3-992190236095\" (UID: \"2c0a976a-e2a7-4fd9-87a3-992190236095\") " Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.761549 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities" (OuterVolumeSpecName: "utilities") pod "2c0a976a-e2a7-4fd9-87a3-992190236095" (UID: "2c0a976a-e2a7-4fd9-87a3-992190236095"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.767283 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz" (OuterVolumeSpecName: "kube-api-access-5j5pz") pod "2c0a976a-e2a7-4fd9-87a3-992190236095" (UID: "2c0a976a-e2a7-4fd9-87a3-992190236095"). InnerVolumeSpecName "kube-api-access-5j5pz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.861573 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5pz\" (UniqueName: \"kubernetes.io/projected/2c0a976a-e2a7-4fd9-87a3-992190236095-kube-api-access-5j5pz\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.861629 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.910764 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c0a976a-e2a7-4fd9-87a3-992190236095" (UID: "2c0a976a-e2a7-4fd9-87a3-992190236095"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:12 crc kubenswrapper[4822]: I1010 06:37:12.962987 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a976a-e2a7-4fd9-87a3-992190236095-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.130467 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4phn" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.130495 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerDied","Data":"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b"} Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.131092 4822 scope.go:117] "RemoveContainer" containerID="33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.130405 4822 generic.go:334] "Generic (PLEG): container finished" podID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerID="33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b" exitCode=0 Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.131197 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4phn" event={"ID":"2c0a976a-e2a7-4fd9-87a3-992190236095","Type":"ContainerDied","Data":"550c5b27def9d1b345050feb7a6a34ceb5c88d653116d98344c4f4ff35eece33"} Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.148765 4822 scope.go:117] "RemoveContainer" containerID="476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.174397 4822 scope.go:117] "RemoveContainer" containerID="22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.197994 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.201487 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4phn"] Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.210580 4822 scope.go:117] "RemoveContainer" 
containerID="33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b" Oct 10 06:37:13 crc kubenswrapper[4822]: E1010 06:37:13.211004 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b\": container with ID starting with 33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b not found: ID does not exist" containerID="33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.211185 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b"} err="failed to get container status \"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b\": rpc error: code = NotFound desc = could not find container \"33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b\": container with ID starting with 33ea0c615e7b7a8b7f6264b58ace523d03200d2d1ff030cc696387ed688ad07b not found: ID does not exist" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.211266 4822 scope.go:117] "RemoveContainer" containerID="476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881" Oct 10 06:37:13 crc kubenswrapper[4822]: E1010 06:37:13.211706 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881\": container with ID starting with 476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881 not found: ID does not exist" containerID="476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.211887 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881"} err="failed to get container status \"476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881\": rpc error: code = NotFound desc = could not find container \"476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881\": container with ID starting with 476fa1b20e56b9a89a5bed113e9937812daf81215574e89f83f0f0c44482f881 not found: ID does not exist" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.211968 4822 scope.go:117] "RemoveContainer" containerID="22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72" Oct 10 06:37:13 crc kubenswrapper[4822]: E1010 06:37:13.212497 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72\": container with ID starting with 22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72 not found: ID does not exist" containerID="22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.212545 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72"} err="failed to get container status \"22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72\": rpc error: code = NotFound desc = could not find container \"22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72\": container with ID starting with 22d2a98609f118abb61c4de1bb5ffa094414cb43093dab33040cac08dc53aa72 not found: ID does not exist" Oct 10 06:37:13 crc kubenswrapper[4822]: I1010 06:37:13.658595 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" path="/var/lib/kubelet/pods/2c0a976a-e2a7-4fd9-87a3-992190236095/volumes" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 
06:37:16.113191 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:16 crc kubenswrapper[4822]: E1010 06:37:16.118104 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="extract-content" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.118127 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="extract-content" Oct 10 06:37:16 crc kubenswrapper[4822]: E1010 06:37:16.118152 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="registry-server" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.118162 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="registry-server" Oct 10 06:37:16 crc kubenswrapper[4822]: E1010 06:37:16.118179 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="extract-utilities" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.118186 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="extract-utilities" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.118340 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0a976a-e2a7-4fd9-87a3-992190236095" containerName="registry-server" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.119769 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.137536 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.211456 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9cbz\" (UniqueName: \"kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.211511 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.211534 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.321976 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9cbz\" (UniqueName: \"kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.322066 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.322100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.322618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.322923 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.350148 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9cbz\" (UniqueName: \"kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz\") pod \"certified-operators-qr5wq\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.460758 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:16 crc kubenswrapper[4822]: I1010 06:37:16.935505 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.156578 4822 generic.go:334] "Generic (PLEG): container finished" podID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerID="c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30" exitCode=0 Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.156662 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerDied","Data":"c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30"} Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.156893 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerStarted","Data":"25e748e0d39403315aaf41659d03753070cbf3d96770ff25de2f9e479e4f9052"} Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.276055 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.276098 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:17 crc kubenswrapper[4822]: I1010 06:37:17.312132 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:18 crc kubenswrapper[4822]: I1010 06:37:18.170956 4822 generic.go:334] "Generic (PLEG): container finished" podID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerID="020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6" exitCode=0 Oct 10 06:37:18 crc 
kubenswrapper[4822]: I1010 06:37:18.172041 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerDied","Data":"020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6"} Oct 10 06:37:18 crc kubenswrapper[4822]: I1010 06:37:18.223711 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:19 crc kubenswrapper[4822]: I1010 06:37:19.178886 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerStarted","Data":"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e"} Oct 10 06:37:19 crc kubenswrapper[4822]: I1010 06:37:19.199128 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qr5wq" podStartSLOduration=1.625322652 podStartE2EDuration="3.199098163s" podCreationTimestamp="2025-10-10 06:37:16 +0000 UTC" firstStartedPulling="2025-10-10 06:37:17.157932269 +0000 UTC m=+784.253090465" lastFinishedPulling="2025-10-10 06:37:18.73170778 +0000 UTC m=+785.826865976" observedRunningTime="2025-10-10 06:37:19.195125385 +0000 UTC m=+786.290283581" watchObservedRunningTime="2025-10-10 06:37:19.199098163 +0000 UTC m=+786.294256359" Oct 10 06:37:19 crc kubenswrapper[4822]: I1010 06:37:19.697313 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.184440 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dr568" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="registry-server" containerID="cri-o://a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8" gracePeriod=2 Oct 10 
06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.637898 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.797013 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98qbh\" (UniqueName: \"kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh\") pod \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.797066 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities\") pod \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.797091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content\") pod \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\" (UID: \"f4b81a6f-7626-4d1e-8781-acace6cc4da9\") " Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.804456 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities" (OuterVolumeSpecName: "utilities") pod "f4b81a6f-7626-4d1e-8781-acace6cc4da9" (UID: "f4b81a6f-7626-4d1e-8781-acace6cc4da9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.808988 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4b81a6f-7626-4d1e-8781-acace6cc4da9" (UID: "f4b81a6f-7626-4d1e-8781-acace6cc4da9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.818123 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh" (OuterVolumeSpecName: "kube-api-access-98qbh") pod "f4b81a6f-7626-4d1e-8781-acace6cc4da9" (UID: "f4b81a6f-7626-4d1e-8781-acace6cc4da9"). InnerVolumeSpecName "kube-api-access-98qbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.898907 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98qbh\" (UniqueName: \"kubernetes.io/projected/f4b81a6f-7626-4d1e-8781-acace6cc4da9-kube-api-access-98qbh\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.898948 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:20 crc kubenswrapper[4822]: I1010 06:37:20.898961 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b81a6f-7626-4d1e-8781-acace6cc4da9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.192543 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" 
containerID="a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8" exitCode=0 Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.192590 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerDied","Data":"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8"} Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.192615 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dr568" event={"ID":"f4b81a6f-7626-4d1e-8781-acace6cc4da9","Type":"ContainerDied","Data":"e8f305f25f3b1dcbd384c796ef23b17de5d7601b8ba046b9270142ee1af9f760"} Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.192633 4822 scope.go:117] "RemoveContainer" containerID="a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.192659 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dr568" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.217368 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.217940 4822 scope.go:117] "RemoveContainer" containerID="d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.222225 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dr568"] Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.235874 4822 scope.go:117] "RemoveContainer" containerID="b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.257730 4822 scope.go:117] "RemoveContainer" containerID="a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8" Oct 10 06:37:21 crc kubenswrapper[4822]: E1010 06:37:21.258152 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8\": container with ID starting with a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8 not found: ID does not exist" containerID="a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.258256 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8"} err="failed to get container status \"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8\": rpc error: code = NotFound desc = could not find container \"a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8\": container with ID starting with a3dfa208e0ebd5f0eee8f108acc75c83bea3d14c501dc3c12c88f40e4ceb27f8 not found: 
ID does not exist" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.258342 4822 scope.go:117] "RemoveContainer" containerID="d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5" Oct 10 06:37:21 crc kubenswrapper[4822]: E1010 06:37:21.258615 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5\": container with ID starting with d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5 not found: ID does not exist" containerID="d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.258637 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5"} err="failed to get container status \"d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5\": rpc error: code = NotFound desc = could not find container \"d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5\": container with ID starting with d87dc3e55ab3afdecf12b8499d0eb7e15dbcfe466db5a20e14fa4b6bce1f91c5 not found: ID does not exist" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.258650 4822 scope.go:117] "RemoveContainer" containerID="b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44" Oct 10 06:37:21 crc kubenswrapper[4822]: E1010 06:37:21.258860 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44\": container with ID starting with b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44 not found: ID does not exist" containerID="b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.258880 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44"} err="failed to get container status \"b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44\": rpc error: code = NotFound desc = could not find container \"b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44\": container with ID starting with b7b127828394cc10a016dbf778588296642f70fd1711b9890710b4b551927d44 not found: ID does not exist" Oct 10 06:37:21 crc kubenswrapper[4822]: I1010 06:37:21.662545 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" path="/var/lib/kubelet/pods/f4b81a6f-7626-4d1e-8781-acace6cc4da9/volumes" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.157628 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kvjlx" podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerName="console" containerID="cri-o://69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172" gracePeriod=15 Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.633311 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kvjlx_c0649cc6-9ef6-4ecb-9a0e-fac537a3f208/console/0.log" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.633383 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755731 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfp7s\" (UniqueName: \"kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755784 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755858 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755896 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755917 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.755965 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.756082 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert\") pod \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\" (UID: \"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208\") " Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.756730 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca" (OuterVolumeSpecName: "service-ca") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.756746 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.756769 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.756786 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config" (OuterVolumeSpecName: "console-config") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.758161 4822 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.758197 4822 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.758215 4822 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.758231 4822 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.762075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.762188 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s" (OuterVolumeSpecName: "kube-api-access-tfp7s") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "kube-api-access-tfp7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.762328 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" (UID: "c0649cc6-9ef6-4ecb-9a0e-fac537a3f208"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.859847 4822 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.859879 4822 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:24 crc kubenswrapper[4822]: I1010 06:37:24.859890 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfp7s\" (UniqueName: \"kubernetes.io/projected/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208-kube-api-access-tfp7s\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.221785 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-kvjlx_c0649cc6-9ef6-4ecb-9a0e-fac537a3f208/console/0.log" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.221895 4822 generic.go:334] "Generic (PLEG): container finished" podID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerID="69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172" exitCode=2 Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.221938 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kvjlx" event={"ID":"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208","Type":"ContainerDied","Data":"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172"} Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.221975 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kvjlx" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.222005 4822 scope.go:117] "RemoveContainer" containerID="69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.221976 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kvjlx" event={"ID":"c0649cc6-9ef6-4ecb-9a0e-fac537a3f208","Type":"ContainerDied","Data":"1a84fff145105e53338569f1d6cc247e233d09f215b39f5a537f99e7b3aa0ef5"} Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.246509 4822 scope.go:117] "RemoveContainer" containerID="69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172" Oct 10 06:37:25 crc kubenswrapper[4822]: E1010 06:37:25.248693 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172\": container with ID starting with 69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172 not found: ID does not exist" 
containerID="69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.248748 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172"} err="failed to get container status \"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172\": rpc error: code = NotFound desc = could not find container \"69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172\": container with ID starting with 69bc8a47e6e1a81fd8c3dbc1a489797b28c8d0e45e6e8789c0cef712fd568172 not found: ID does not exist" Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.259617 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.263289 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kvjlx"] Oct 10 06:37:25 crc kubenswrapper[4822]: I1010 06:37:25.657848 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" path="/var/lib/kubelet/pods/c0649cc6-9ef6-4ecb-9a0e-fac537a3f208/volumes" Oct 10 06:37:26 crc kubenswrapper[4822]: I1010 06:37:26.461399 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:26 crc kubenswrapper[4822]: I1010 06:37:26.461448 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:26 crc kubenswrapper[4822]: I1010 06:37:26.530520 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.295312 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.552319 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp"] Oct 10 06:37:27 crc kubenswrapper[4822]: E1010 06:37:27.552733 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerName="console" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.552763 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerName="console" Oct 10 06:37:27 crc kubenswrapper[4822]: E1010 06:37:27.552785 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="extract-content" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.552836 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="extract-content" Oct 10 06:37:27 crc kubenswrapper[4822]: E1010 06:37:27.552885 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="registry-server" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.552908 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="registry-server" Oct 10 06:37:27 crc kubenswrapper[4822]: E1010 06:37:27.552937 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="extract-utilities" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.552954 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="extract-utilities" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.553178 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0649cc6-9ef6-4ecb-9a0e-fac537a3f208" containerName="console" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.553239 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b81a6f-7626-4d1e-8781-acace6cc4da9" containerName="registry-server" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.555027 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.558362 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.562415 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp"] Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.698518 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.698667 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.698885 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-l2kmz\" (UniqueName: \"kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.711993 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.713937 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.719976 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.799744 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kmz\" (UniqueName: \"kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.799854 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.799955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.800458 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.800462 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.827007 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kmz\" (UniqueName: \"kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.889175 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.900795 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pmh\" (UniqueName: \"kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.900874 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:27 crc kubenswrapper[4822]: I1010 06:37:27.901159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.002096 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.002149 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pmh\" (UniqueName: 
\"kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.002172 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.002629 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.002675 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.029519 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pmh\" (UniqueName: \"kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh\") pod \"community-operators-m7fgw\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.049446 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.311355 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp"] Oct 10 06:37:28 crc kubenswrapper[4822]: I1010 06:37:28.345558 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.267376 4822 generic.go:334] "Generic (PLEG): container finished" podID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerID="6306995d75816844e245a2eb7f184b41937fd25c5f97f521e9f2b795219ebdae" exitCode=0 Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.267478 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerDied","Data":"6306995d75816844e245a2eb7f184b41937fd25c5f97f521e9f2b795219ebdae"} Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.267517 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerStarted","Data":"f71a10244d62518846e23438182eb0f447d92a58b5bbbf034ef6e19ff77e1712"} Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.269378 4822 generic.go:334] "Generic (PLEG): container finished" podID="9daa5efb-5a21-442a-b635-4e0026502d93" containerID="87e589826a4cf5cc8b615e6d5fb19fe1b01c2ce34464276d149988d544fb374d" exitCode=0 Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.269444 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" 
event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerDied","Data":"87e589826a4cf5cc8b615e6d5fb19fe1b01c2ce34464276d149988d544fb374d"} Oct 10 06:37:29 crc kubenswrapper[4822]: I1010 06:37:29.269471 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerStarted","Data":"15ba70c59a8187f45f44235936b02728197c40cff3d2112986ea2d38f6ac3aeb"} Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.276635 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerStarted","Data":"6af6a727580eaa0b60246bc9e849524b2055bbb01313054988929b27dc46980a"} Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.301418 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.301731 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qr5wq" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="registry-server" containerID="cri-o://c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e" gracePeriod=2 Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.725663 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.843334 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities\") pod \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.843401 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content\") pod \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.843427 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9cbz\" (UniqueName: \"kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz\") pod \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\" (UID: \"8fb10804-5ba4-479a-b78b-3c9d1dec7feb\") " Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.845294 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities" (OuterVolumeSpecName: "utilities") pod "8fb10804-5ba4-479a-b78b-3c9d1dec7feb" (UID: "8fb10804-5ba4-479a-b78b-3c9d1dec7feb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.853509 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz" (OuterVolumeSpecName: "kube-api-access-m9cbz") pod "8fb10804-5ba4-479a-b78b-3c9d1dec7feb" (UID: "8fb10804-5ba4-479a-b78b-3c9d1dec7feb"). InnerVolumeSpecName "kube-api-access-m9cbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.923962 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fb10804-5ba4-479a-b78b-3c9d1dec7feb" (UID: "8fb10804-5ba4-479a-b78b-3c9d1dec7feb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.945099 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.945172 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:30 crc kubenswrapper[4822]: I1010 06:37:30.945188 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9cbz\" (UniqueName: \"kubernetes.io/projected/8fb10804-5ba4-479a-b78b-3c9d1dec7feb-kube-api-access-m9cbz\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.287174 4822 generic.go:334] "Generic (PLEG): container finished" podID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerID="c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e" exitCode=0 Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.287280 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerDied","Data":"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e"} Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.287289 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qr5wq" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.287321 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qr5wq" event={"ID":"8fb10804-5ba4-479a-b78b-3c9d1dec7feb","Type":"ContainerDied","Data":"25e748e0d39403315aaf41659d03753070cbf3d96770ff25de2f9e479e4f9052"} Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.287352 4822 scope.go:117] "RemoveContainer" containerID="c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.290953 4822 generic.go:334] "Generic (PLEG): container finished" podID="9daa5efb-5a21-442a-b635-4e0026502d93" containerID="6af6a727580eaa0b60246bc9e849524b2055bbb01313054988929b27dc46980a" exitCode=0 Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.290998 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerDied","Data":"6af6a727580eaa0b60246bc9e849524b2055bbb01313054988929b27dc46980a"} Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.309210 4822 scope.go:117] "RemoveContainer" containerID="020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.328153 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.331169 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qr5wq"] Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.347458 4822 scope.go:117] "RemoveContainer" containerID="c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.366496 4822 scope.go:117] "RemoveContainer" 
containerID="c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e" Oct 10 06:37:31 crc kubenswrapper[4822]: E1010 06:37:31.367247 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e\": container with ID starting with c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e not found: ID does not exist" containerID="c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.367283 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e"} err="failed to get container status \"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e\": rpc error: code = NotFound desc = could not find container \"c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e\": container with ID starting with c3a3eb20af2971d3b7d7a9a1e66382092820d72015a9b1ffc43f44375719560e not found: ID does not exist" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.367307 4822 scope.go:117] "RemoveContainer" containerID="020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6" Oct 10 06:37:31 crc kubenswrapper[4822]: E1010 06:37:31.367674 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6\": container with ID starting with 020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6 not found: ID does not exist" containerID="020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.367695 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6"} err="failed to get container status \"020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6\": rpc error: code = NotFound desc = could not find container \"020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6\": container with ID starting with 020e34b906c475511f930ce5789f35db219724cb71862095caadeea0eac7ace6 not found: ID does not exist" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.367712 4822 scope.go:117] "RemoveContainer" containerID="c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30" Oct 10 06:37:31 crc kubenswrapper[4822]: E1010 06:37:31.367998 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30\": container with ID starting with c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30 not found: ID does not exist" containerID="c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.368028 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30"} err="failed to get container status \"c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30\": rpc error: code = NotFound desc = could not find container \"c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30\": container with ID starting with c1e0b8396f1624951b1ad4245971e556741f4005d50fe31ee61b78b57810cd30 not found: ID does not exist" Oct 10 06:37:31 crc kubenswrapper[4822]: I1010 06:37:31.662871 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" path="/var/lib/kubelet/pods/8fb10804-5ba4-479a-b78b-3c9d1dec7feb/volumes" Oct 10 06:37:32 crc kubenswrapper[4822]: I1010 
06:37:32.304172 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerStarted","Data":"6e51169bf03e2d438aa7241e7da0e0834c176da3ad54b78f80a16a2565616730"} Oct 10 06:37:32 crc kubenswrapper[4822]: I1010 06:37:32.332469 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7fgw" podStartSLOduration=2.840026529 podStartE2EDuration="5.332446837s" podCreationTimestamp="2025-10-10 06:37:27 +0000 UTC" firstStartedPulling="2025-10-10 06:37:29.271266652 +0000 UTC m=+796.366424858" lastFinishedPulling="2025-10-10 06:37:31.76368696 +0000 UTC m=+798.858845166" observedRunningTime="2025-10-10 06:37:32.331087547 +0000 UTC m=+799.426245753" watchObservedRunningTime="2025-10-10 06:37:32.332446837 +0000 UTC m=+799.427605043" Oct 10 06:37:35 crc kubenswrapper[4822]: I1010 06:37:35.327910 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerStarted","Data":"410e345a900c5692110c1288c0574878b3a17bf628ecbef59165b982de22aed7"} Oct 10 06:37:36 crc kubenswrapper[4822]: I1010 06:37:36.337531 4822 generic.go:334] "Generic (PLEG): container finished" podID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerID="410e345a900c5692110c1288c0574878b3a17bf628ecbef59165b982de22aed7" exitCode=0 Oct 10 06:37:36 crc kubenswrapper[4822]: I1010 06:37:36.337582 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerDied","Data":"410e345a900c5692110c1288c0574878b3a17bf628ecbef59165b982de22aed7"} Oct 10 06:37:37 crc kubenswrapper[4822]: I1010 06:37:37.360765 4822 generic.go:334] "Generic (PLEG): container 
finished" podID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerID="43cf9703482023b7a57351841152d7708e7d9cf358c1d602bfd32b6b4eb39b60" exitCode=0 Oct 10 06:37:37 crc kubenswrapper[4822]: I1010 06:37:37.360834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerDied","Data":"43cf9703482023b7a57351841152d7708e7d9cf358c1d602bfd32b6b4eb39b60"} Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.051093 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.051173 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.122777 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.417081 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.652214 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.775825 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kmz\" (UniqueName: \"kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz\") pod \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.775884 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util\") pod \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.775936 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle\") pod \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\" (UID: \"eaf4d641-b224-4693-b0ad-b9dd73bd0681\") " Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.776991 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle" (OuterVolumeSpecName: "bundle") pod "eaf4d641-b224-4693-b0ad-b9dd73bd0681" (UID: "eaf4d641-b224-4693-b0ad-b9dd73bd0681"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.783012 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz" (OuterVolumeSpecName: "kube-api-access-l2kmz") pod "eaf4d641-b224-4693-b0ad-b9dd73bd0681" (UID: "eaf4d641-b224-4693-b0ad-b9dd73bd0681"). InnerVolumeSpecName "kube-api-access-l2kmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.797489 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util" (OuterVolumeSpecName: "util") pod "eaf4d641-b224-4693-b0ad-b9dd73bd0681" (UID: "eaf4d641-b224-4693-b0ad-b9dd73bd0681"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.877131 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kmz\" (UniqueName: \"kubernetes.io/projected/eaf4d641-b224-4693-b0ad-b9dd73bd0681-kube-api-access-l2kmz\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.877183 4822 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-util\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:38 crc kubenswrapper[4822]: I1010 06:37:38.877202 4822 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d641-b224-4693-b0ad-b9dd73bd0681-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:39 crc kubenswrapper[4822]: I1010 06:37:39.377663 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" event={"ID":"eaf4d641-b224-4693-b0ad-b9dd73bd0681","Type":"ContainerDied","Data":"f71a10244d62518846e23438182eb0f447d92a58b5bbbf034ef6e19ff77e1712"} Oct 10 06:37:39 crc kubenswrapper[4822]: I1010 06:37:39.377717 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp" Oct 10 06:37:39 crc kubenswrapper[4822]: I1010 06:37:39.377734 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71a10244d62518846e23438182eb0f447d92a58b5bbbf034ef6e19ff77e1712" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.094509 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.095495 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m7fgw" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="registry-server" containerID="cri-o://6e51169bf03e2d438aa7241e7da0e0834c176da3ad54b78f80a16a2565616730" gracePeriod=2 Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.402534 4822 generic.go:334] "Generic (PLEG): container finished" podID="9daa5efb-5a21-442a-b635-4e0026502d93" containerID="6e51169bf03e2d438aa7241e7da0e0834c176da3ad54b78f80a16a2565616730" exitCode=0 Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.402867 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerDied","Data":"6e51169bf03e2d438aa7241e7da0e0834c176da3ad54b78f80a16a2565616730"} Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.512858 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.544011 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities\") pod \"9daa5efb-5a21-442a-b635-4e0026502d93\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.544377 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content\") pod \"9daa5efb-5a21-442a-b635-4e0026502d93\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.544524 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7pmh\" (UniqueName: \"kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh\") pod \"9daa5efb-5a21-442a-b635-4e0026502d93\" (UID: \"9daa5efb-5a21-442a-b635-4e0026502d93\") " Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.545250 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities" (OuterVolumeSpecName: "utilities") pod "9daa5efb-5a21-442a-b635-4e0026502d93" (UID: "9daa5efb-5a21-442a-b635-4e0026502d93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.552010 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh" (OuterVolumeSpecName: "kube-api-access-m7pmh") pod "9daa5efb-5a21-442a-b635-4e0026502d93" (UID: "9daa5efb-5a21-442a-b635-4e0026502d93"). InnerVolumeSpecName "kube-api-access-m7pmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.619734 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9daa5efb-5a21-442a-b635-4e0026502d93" (UID: "9daa5efb-5a21-442a-b635-4e0026502d93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.646092 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.646133 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa5efb-5a21-442a-b635-4e0026502d93-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:43 crc kubenswrapper[4822]: I1010 06:37:43.646147 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7pmh\" (UniqueName: \"kubernetes.io/projected/9daa5efb-5a21-442a-b635-4e0026502d93-kube-api-access-m7pmh\") on node \"crc\" DevicePath \"\"" Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.414120 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fgw" event={"ID":"9daa5efb-5a21-442a-b635-4e0026502d93","Type":"ContainerDied","Data":"15ba70c59a8187f45f44235936b02728197c40cff3d2112986ea2d38f6ac3aeb"} Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.414494 4822 scope.go:117] "RemoveContainer" containerID="6e51169bf03e2d438aa7241e7da0e0834c176da3ad54b78f80a16a2565616730" Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.414201 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fgw" Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.435926 4822 scope.go:117] "RemoveContainer" containerID="6af6a727580eaa0b60246bc9e849524b2055bbb01313054988929b27dc46980a" Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.442883 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.454504 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m7fgw"] Oct 10 06:37:44 crc kubenswrapper[4822]: I1010 06:37:44.462048 4822 scope.go:117] "RemoveContainer" containerID="87e589826a4cf5cc8b615e6d5fb19fe1b01c2ce34464276d149988d544fb374d" Oct 10 06:37:45 crc kubenswrapper[4822]: I1010 06:37:45.658282 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" path="/var/lib/kubelet/pods/9daa5efb-5a21-442a-b635-4e0026502d93/volumes" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.583501 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7"] Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.583942 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="extract-utilities" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.583956 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="extract-utilities" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.583963 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="util" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.583971 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="util" 
Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.583978 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.583984 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.583992 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="pull" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.583998 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="pull" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.584010 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="extract-content" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584017 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="extract-content" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.584026 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="extract-utilities" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584032 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="extract-utilities" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.584041 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="extract-content" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584046 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="extract-content" Oct 10 06:37:48 crc 
kubenswrapper[4822]: E1010 06:37:48.584053 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="extract" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584058 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="extract" Oct 10 06:37:48 crc kubenswrapper[4822]: E1010 06:37:48.584066 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584072 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584170 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb10804-5ba4-479a-b78b-3c9d1dec7feb" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584184 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daa5efb-5a21-442a-b635-4e0026502d93" containerName="registry-server" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584195 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf4d641-b224-4693-b0ad-b9dd73bd0681" containerName="extract" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.584566 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.586557 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kv6js" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.586557 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.586554 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.586902 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.591258 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.607009 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7"] Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.607583 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-webhook-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.607639 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/f3981699-53fd-4702-b1b4-e5a948937551-kube-api-access-t442m\") pod 
\"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.607679 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-apiservice-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.709069 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-webhook-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.709117 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/f3981699-53fd-4702-b1b4-e5a948937551-kube-api-access-t442m\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.709415 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-apiservice-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc 
kubenswrapper[4822]: I1010 06:37:48.717754 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-webhook-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.719321 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3981699-53fd-4702-b1b4-e5a948937551-apiservice-cert\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.727352 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/f3981699-53fd-4702-b1b4-e5a948937551-kube-api-access-t442m\") pod \"metallb-operator-controller-manager-746cb4bdc6-l6fk7\" (UID: \"f3981699-53fd-4702-b1b4-e5a948937551\") " pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.842747 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b74789bd-zcp27"] Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.843968 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.848614 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.848762 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4r7lx" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.849490 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.863976 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b74789bd-zcp27"] Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.898587 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.926383 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-webhook-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 06:37:48.926444 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrfm\" (UniqueName: \"kubernetes.io/projected/5ea5d77d-f95c-469d-8dcd-f02605187e89-kube-api-access-pvrfm\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:48 crc kubenswrapper[4822]: I1010 
06:37:48.926504 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-apiservice-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.028300 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-apiservice-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.028375 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-webhook-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.028440 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrfm\" (UniqueName: \"kubernetes.io/projected/5ea5d77d-f95c-469d-8dcd-f02605187e89-kube-api-access-pvrfm\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.038849 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-apiservice-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" 
(UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.038903 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ea5d77d-f95c-469d-8dcd-f02605187e89-webhook-cert\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.059362 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrfm\" (UniqueName: \"kubernetes.io/projected/5ea5d77d-f95c-469d-8dcd-f02605187e89-kube-api-access-pvrfm\") pod \"metallb-operator-webhook-server-b74789bd-zcp27\" (UID: \"5ea5d77d-f95c-469d-8dcd-f02605187e89\") " pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.105459 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7"] Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.159222 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.441772 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" event={"ID":"f3981699-53fd-4702-b1b4-e5a948937551","Type":"ContainerStarted","Data":"8ad7e4f73f451afce81e361cb9479bc65f52ab8d84a38bab4fc83650d8942490"} Oct 10 06:37:49 crc kubenswrapper[4822]: I1010 06:37:49.619590 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b74789bd-zcp27"] Oct 10 06:37:49 crc kubenswrapper[4822]: W1010 06:37:49.626679 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea5d77d_f95c_469d_8dcd_f02605187e89.slice/crio-0b244f9382c3b3506af95101a0c4d7dcc1823c0ef4af043c37a609d9ea55b584 WatchSource:0}: Error finding container 0b244f9382c3b3506af95101a0c4d7dcc1823c0ef4af043c37a609d9ea55b584: Status 404 returned error can't find the container with id 0b244f9382c3b3506af95101a0c4d7dcc1823c0ef4af043c37a609d9ea55b584 Oct 10 06:37:50 crc kubenswrapper[4822]: I1010 06:37:50.451028 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" event={"ID":"5ea5d77d-f95c-469d-8dcd-f02605187e89","Type":"ContainerStarted","Data":"0b244f9382c3b3506af95101a0c4d7dcc1823c0ef4af043c37a609d9ea55b584"} Oct 10 06:37:52 crc kubenswrapper[4822]: I1010 06:37:52.464788 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" event={"ID":"f3981699-53fd-4702-b1b4-e5a948937551","Type":"ContainerStarted","Data":"855c38462c26733563be8fee9d893b27a68ff9473decfab0ee7c6369a742cfc3"} Oct 10 06:37:52 crc kubenswrapper[4822]: I1010 06:37:52.465137 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:37:52 crc kubenswrapper[4822]: I1010 06:37:52.490742 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" podStartSLOduration=1.514537227 podStartE2EDuration="4.490724015s" podCreationTimestamp="2025-10-10 06:37:48 +0000 UTC" firstStartedPulling="2025-10-10 06:37:49.120763582 +0000 UTC m=+816.215921768" lastFinishedPulling="2025-10-10 06:37:52.09695032 +0000 UTC m=+819.192108556" observedRunningTime="2025-10-10 06:37:52.480070329 +0000 UTC m=+819.575228535" watchObservedRunningTime="2025-10-10 06:37:52.490724015 +0000 UTC m=+819.585882211" Oct 10 06:37:54 crc kubenswrapper[4822]: I1010 06:37:54.477426 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" event={"ID":"5ea5d77d-f95c-469d-8dcd-f02605187e89","Type":"ContainerStarted","Data":"a796e99f3f6c76801cecaf7b1d4045a3843de52a105612bd149e61e110141afb"} Oct 10 06:37:54 crc kubenswrapper[4822]: I1010 06:37:54.477771 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:37:54 crc kubenswrapper[4822]: I1010 06:37:54.498766 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" podStartSLOduration=1.9731756310000002 podStartE2EDuration="6.498745064s" podCreationTimestamp="2025-10-10 06:37:48 +0000 UTC" firstStartedPulling="2025-10-10 06:37:49.630732751 +0000 UTC m=+816.725890947" lastFinishedPulling="2025-10-10 06:37:54.156302184 +0000 UTC m=+821.251460380" observedRunningTime="2025-10-10 06:37:54.495991212 +0000 UTC m=+821.591149428" watchObservedRunningTime="2025-10-10 06:37:54.498745064 +0000 UTC m=+821.593903260" Oct 10 06:38:09 crc kubenswrapper[4822]: I1010 06:38:09.173446 4822 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b74789bd-zcp27" Oct 10 06:38:28 crc kubenswrapper[4822]: I1010 06:38:28.901554 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-746cb4bdc6-l6fk7" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.646736 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.647416 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.650364 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.650747 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4djkc" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.667583 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xq8m5"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.690205 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.690321 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.693214 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.693395 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.742699 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-sfhk5"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.743531 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-sfhk5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.745732 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.745734 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.745924 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.746062 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jtxhf" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.766866 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-nzzr4"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.768110 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.770629 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-nzzr4"] Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.777162 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812455 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4pfg\" (UniqueName: \"kubernetes.io/projected/ccf84349-3882-419d-8349-90a71b1a70cc-kube-api-access-n4pfg\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812514 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/457e5c37-7370-4959-9199-3217ee9b5b26-frr-startup\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812536 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9c4j\" (UniqueName: \"kubernetes.io/projected/457e5c37-7370-4959-9199-3217ee9b5b26-kube-api-access-v9c4j\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812563 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-metrics\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " 
pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812585 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-sockets\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-conf\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812644 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccf84349-3882-419d-8349-90a71b1a70cc-cert\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812658 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/457e5c37-7370-4959-9199-3217ee9b5b26-metrics-certs\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.812694 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-reloader\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 
06:38:29.914580 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-reloader\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.914684 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.914833 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4pfg\" (UniqueName: \"kubernetes.io/projected/ccf84349-3882-419d-8349-90a71b1a70cc-kube-api-access-n4pfg\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.915058 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-reloader\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.915479 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-metrics-certs\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.915653 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/457e5c37-7370-4959-9199-3217ee9b5b26-frr-startup\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.915742 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9c4j\" (UniqueName: \"kubernetes.io/projected/457e5c37-7370-4959-9199-3217ee9b5b26-kube-api-access-v9c4j\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917091 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-metrics-certs\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94de9084-9149-456e-9e20-9415eebcd145-metallb-excludel2\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917193 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwjh5\" (UniqueName: \"kubernetes.io/projected/e280e687-d626-4b27-bcef-9257b81b8b12-kube-api-access-gwjh5\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917334 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-metrics\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917379 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-cert\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917428 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-sockets\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917463 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-conf\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917510 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccf84349-3882-419d-8349-90a71b1a70cc-cert\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917528 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/457e5c37-7370-4959-9199-3217ee9b5b26-frr-startup\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " 
pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917537 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/457e5c37-7370-4959-9199-3217ee9b5b26-metrics-certs\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917608 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9447\" (UniqueName: \"kubernetes.io/projected/94de9084-9149-456e-9e20-9415eebcd145-kube-api-access-n9447\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917660 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-metrics\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.917933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-sockets\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.918152 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/457e5c37-7370-4959-9199-3217ee9b5b26-frr-conf\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.924010 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ccf84349-3882-419d-8349-90a71b1a70cc-cert\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.924310 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/457e5c37-7370-4959-9199-3217ee9b5b26-metrics-certs\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.931658 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9c4j\" (UniqueName: \"kubernetes.io/projected/457e5c37-7370-4959-9199-3217ee9b5b26-kube-api-access-v9c4j\") pod \"frr-k8s-xq8m5\" (UID: \"457e5c37-7370-4959-9199-3217ee9b5b26\") " pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.944837 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4pfg\" (UniqueName: \"kubernetes.io/projected/ccf84349-3882-419d-8349-90a71b1a70cc-kube-api-access-n4pfg\") pod \"frr-k8s-webhook-server-64bf5d555-l5c7z\" (UID: \"ccf84349-3882-419d-8349-90a71b1a70cc\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:29 crc kubenswrapper[4822]: I1010 06:38:29.982569 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.009255 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019157 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019217 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-metrics-certs\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019257 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-metrics-certs\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019282 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94de9084-9149-456e-9e20-9415eebcd145-metallb-excludel2\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019309 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwjh5\" (UniqueName: \"kubernetes.io/projected/e280e687-d626-4b27-bcef-9257b81b8b12-kube-api-access-gwjh5\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019341 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-cert\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.019376 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9447\" (UniqueName: \"kubernetes.io/projected/94de9084-9149-456e-9e20-9415eebcd145-kube-api-access-n9447\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: E1010 06:38:30.019779 4822 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 10 06:38:30 crc kubenswrapper[4822]: E1010 06:38:30.019855 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist podName:94de9084-9149-456e-9e20-9415eebcd145 nodeName:}" failed. No retries permitted until 2025-10-10 06:38:30.519836394 +0000 UTC m=+857.614994590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist") pod "speaker-sfhk5" (UID: "94de9084-9149-456e-9e20-9415eebcd145") : secret "metallb-memberlist" not found Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.022614 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94de9084-9149-456e-9e20-9415eebcd145-metallb-excludel2\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.023480 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-metrics-certs\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.025614 4822 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.031389 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-metrics-certs\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.035940 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e280e687-d626-4b27-bcef-9257b81b8b12-cert\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.037888 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-n9447\" (UniqueName: \"kubernetes.io/projected/94de9084-9149-456e-9e20-9415eebcd145-kube-api-access-n9447\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.040588 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwjh5\" (UniqueName: \"kubernetes.io/projected/e280e687-d626-4b27-bcef-9257b81b8b12-kube-api-access-gwjh5\") pod \"controller-68d546b9d8-nzzr4\" (UID: \"e280e687-d626-4b27-bcef-9257b81b8b12\") " pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.094177 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.380423 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z"] Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.502867 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-nzzr4"] Oct 10 06:38:30 crc kubenswrapper[4822]: W1010 06:38:30.507148 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode280e687_d626_4b27_bcef_9257b81b8b12.slice/crio-a9e39d55038a27f99a99f6946f693b69417c3d0a4f9df0d0423c24f6712795c6 WatchSource:0}: Error finding container a9e39d55038a27f99a99f6946f693b69417c3d0a4f9df0d0423c24f6712795c6: Status 404 returned error can't find the container with id a9e39d55038a27f99a99f6946f693b69417c3d0a4f9df0d0423c24f6712795c6 Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.526884 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist\") pod \"speaker-sfhk5\" (UID: 
\"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:30 crc kubenswrapper[4822]: E1010 06:38:30.527040 4822 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 10 06:38:30 crc kubenswrapper[4822]: E1010 06:38:30.527110 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist podName:94de9084-9149-456e-9e20-9415eebcd145 nodeName:}" failed. No retries permitted until 2025-10-10 06:38:31.527088856 +0000 UTC m=+858.622247052 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist") pod "speaker-sfhk5" (UID: "94de9084-9149-456e-9e20-9415eebcd145") : secret "metallb-memberlist" not found Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.716326 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nzzr4" event={"ID":"e280e687-d626-4b27-bcef-9257b81b8b12","Type":"ContainerStarted","Data":"0a317e3b3e2e99af6d5e67563393a126da85684f9d63d7f39d755500643d9c3e"} Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.716369 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nzzr4" event={"ID":"e280e687-d626-4b27-bcef-9257b81b8b12","Type":"ContainerStarted","Data":"a9e39d55038a27f99a99f6946f693b69417c3d0a4f9df0d0423c24f6712795c6"} Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.717493 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" event={"ID":"ccf84349-3882-419d-8349-90a71b1a70cc","Type":"ContainerStarted","Data":"6af3a04667e0ec427be2a528e6826ac7af555b388588998b377be01a7b272290"} Oct 10 06:38:30 crc kubenswrapper[4822]: I1010 06:38:30.718573 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" 
event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"dee898526a9edf27f14ec444b9781a96a767d3dcb58324c3a2cf30b7c9ded2a3"} Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.543795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.550087 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94de9084-9149-456e-9e20-9415eebcd145-memberlist\") pod \"speaker-sfhk5\" (UID: \"94de9084-9149-456e-9e20-9415eebcd145\") " pod="metallb-system/speaker-sfhk5" Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.567842 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-sfhk5" Oct 10 06:38:31 crc kubenswrapper[4822]: W1010 06:38:31.595154 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94de9084_9149_456e_9e20_9415eebcd145.slice/crio-62ae8a91703c5b1dcb1639f4570e64748734bf4f5eb557421334a79721fb7b7a WatchSource:0}: Error finding container 62ae8a91703c5b1dcb1639f4570e64748734bf4f5eb557421334a79721fb7b7a: Status 404 returned error can't find the container with id 62ae8a91703c5b1dcb1639f4570e64748734bf4f5eb557421334a79721fb7b7a Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.727065 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-nzzr4" event={"ID":"e280e687-d626-4b27-bcef-9257b81b8b12","Type":"ContainerStarted","Data":"7ac12f952477eedbd8c7fdeb9d30acb3eecd37db0f85ac64f28ffd3246744e66"} Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.728319 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:31 crc kubenswrapper[4822]: I1010 06:38:31.729172 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sfhk5" event={"ID":"94de9084-9149-456e-9e20-9415eebcd145","Type":"ContainerStarted","Data":"62ae8a91703c5b1dcb1639f4570e64748734bf4f5eb557421334a79721fb7b7a"} Oct 10 06:38:32 crc kubenswrapper[4822]: I1010 06:38:32.746383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sfhk5" event={"ID":"94de9084-9149-456e-9e20-9415eebcd145","Type":"ContainerStarted","Data":"b9d0c589614753a75df098c1520975b7e25c6bfe9439d420b33debebb8d995d8"} Oct 10 06:38:32 crc kubenswrapper[4822]: I1010 06:38:32.746740 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-sfhk5" event={"ID":"94de9084-9149-456e-9e20-9415eebcd145","Type":"ContainerStarted","Data":"f1e951d816fdf63234cd3e88387ab58adb724cf4b783396e37828de59c590999"} Oct 10 06:38:32 crc kubenswrapper[4822]: I1010 06:38:32.746762 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-sfhk5" Oct 10 06:38:32 crc kubenswrapper[4822]: I1010 06:38:32.781203 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-sfhk5" podStartSLOduration=3.781184567 podStartE2EDuration="3.781184567s" podCreationTimestamp="2025-10-10 06:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:38:32.766536069 +0000 UTC m=+859.861694325" watchObservedRunningTime="2025-10-10 06:38:32.781184567 +0000 UTC m=+859.876342763" Oct 10 06:38:32 crc kubenswrapper[4822]: I1010 06:38:32.782384 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-nzzr4" podStartSLOduration=3.782375852 podStartE2EDuration="3.782375852s" 
podCreationTimestamp="2025-10-10 06:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:38:31.767655101 +0000 UTC m=+858.862813297" watchObservedRunningTime="2025-10-10 06:38:32.782375852 +0000 UTC m=+859.877534048" Oct 10 06:38:37 crc kubenswrapper[4822]: I1010 06:38:37.774204 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" event={"ID":"ccf84349-3882-419d-8349-90a71b1a70cc","Type":"ContainerStarted","Data":"ac9eb7ac566fe2e844a2f747e2ab082be5e1ccc97c75635381b3598499ddf449"} Oct 10 06:38:37 crc kubenswrapper[4822]: I1010 06:38:37.774767 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 10 06:38:37 crc kubenswrapper[4822]: I1010 06:38:37.775934 4822 generic.go:334] "Generic (PLEG): container finished" podID="457e5c37-7370-4959-9199-3217ee9b5b26" containerID="061019293fb37b403e90a4c03097bbbbdc33450f6c0bdab4b83c473161b4dfd2" exitCode=0 Oct 10 06:38:37 crc kubenswrapper[4822]: I1010 06:38:37.775970 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerDied","Data":"061019293fb37b403e90a4c03097bbbbdc33450f6c0bdab4b83c473161b4dfd2"} Oct 10 06:38:37 crc kubenswrapper[4822]: I1010 06:38:37.796376 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" podStartSLOduration=2.3037314540000002 podStartE2EDuration="8.796348171s" podCreationTimestamp="2025-10-10 06:38:29 +0000 UTC" firstStartedPulling="2025-10-10 06:38:30.38714646 +0000 UTC m=+857.482304656" lastFinishedPulling="2025-10-10 06:38:36.879763137 +0000 UTC m=+863.974921373" observedRunningTime="2025-10-10 06:38:37.789321026 +0000 UTC m=+864.884479232" watchObservedRunningTime="2025-10-10 
06:38:37.796348171 +0000 UTC m=+864.891506407" Oct 10 06:38:38 crc kubenswrapper[4822]: I1010 06:38:38.782699 4822 generic.go:334] "Generic (PLEG): container finished" podID="457e5c37-7370-4959-9199-3217ee9b5b26" containerID="8c860912862e58292de2b1da8a6cffce5f94f8fa45958c73a9d219ba57bd5aef" exitCode=0 Oct 10 06:38:38 crc kubenswrapper[4822]: I1010 06:38:38.782927 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerDied","Data":"8c860912862e58292de2b1da8a6cffce5f94f8fa45958c73a9d219ba57bd5aef"} Oct 10 06:38:39 crc kubenswrapper[4822]: I1010 06:38:39.791687 4822 generic.go:334] "Generic (PLEG): container finished" podID="457e5c37-7370-4959-9199-3217ee9b5b26" containerID="2ce141be5d25527f07b2f0f49dc798396049900793008e14e3a6ecdea6beeacc" exitCode=0 Oct 10 06:38:39 crc kubenswrapper[4822]: I1010 06:38:39.791736 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerDied","Data":"2ce141be5d25527f07b2f0f49dc798396049900793008e14e3a6ecdea6beeacc"} Oct 10 06:38:40 crc kubenswrapper[4822]: I1010 06:38:40.101321 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-nzzr4" Oct 10 06:38:40 crc kubenswrapper[4822]: I1010 06:38:40.806518 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"2f82413dbb5af70900cc610619306af7b42c04635b88faac38ab550a9ea65100"} Oct 10 06:38:40 crc kubenswrapper[4822]: I1010 06:38:40.806901 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"f2e090ecbd54f487021a4bdfc963a5c286027b6ed5d4a255423664eeb4c9d227"} Oct 10 06:38:40 crc kubenswrapper[4822]: 
I1010 06:38:40.806914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"101b1f8068781b481294a6e5b37262b5184736b3367cf6da77bee1c0a78961bb"} Oct 10 06:38:40 crc kubenswrapper[4822]: I1010 06:38:40.806925 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"fa495674e8923a77b609c29eb28c73aba67a56f544cc56de948e5e83fc007df9"} Oct 10 06:38:40 crc kubenswrapper[4822]: I1010 06:38:40.806935 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"597b97886de5ef9d5e40cad17140e9f4eb69e92eca27aedea76f30b25cac3e5d"} Oct 10 06:38:41 crc kubenswrapper[4822]: I1010 06:38:41.572352 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-sfhk5" Oct 10 06:38:41 crc kubenswrapper[4822]: I1010 06:38:41.818322 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xq8m5" event={"ID":"457e5c37-7370-4959-9199-3217ee9b5b26","Type":"ContainerStarted","Data":"ba76828f5e812e0adf2bd6cb0e065ddf2eaa138e9513b7342315de5d51f464bf"} Oct 10 06:38:41 crc kubenswrapper[4822]: I1010 06:38:41.818725 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:41 crc kubenswrapper[4822]: I1010 06:38:41.854397 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xq8m5" podStartSLOduration=6.178737516 podStartE2EDuration="12.854369607s" podCreationTimestamp="2025-10-10 06:38:29 +0000 UTC" firstStartedPulling="2025-10-10 06:38:30.186678086 +0000 UTC m=+857.281836282" lastFinishedPulling="2025-10-10 06:38:36.862310167 +0000 UTC m=+863.957468373" observedRunningTime="2025-10-10 
06:38:41.841621774 +0000 UTC m=+868.936780030" watchObservedRunningTime="2025-10-10 06:38:41.854369607 +0000 UTC m=+868.949527833" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.407594 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2"] Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.409252 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.411268 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.416305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2"] Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.523717 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.523813 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.523887 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsts\" (UniqueName: \"kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.625631 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.625723 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsts\" (UniqueName: \"kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.625756 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.626191 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.626251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.645329 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsts\" (UniqueName: \"kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:43 crc kubenswrapper[4822]: I1010 06:38:43.730241 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:44 crc kubenswrapper[4822]: W1010 06:38:44.184180 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68f3d31_d2b5_469f_bc4e_b6b89fe95cc9.slice/crio-ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1 WatchSource:0}: Error finding container ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1: Status 404 returned error can't find the container with id ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1 Oct 10 06:38:44 crc kubenswrapper[4822]: I1010 06:38:44.190485 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2"] Oct 10 06:38:44 crc kubenswrapper[4822]: I1010 06:38:44.839570 4822 generic.go:334] "Generic (PLEG): container finished" podID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerID="89269994d826c627eaa850dd47a8c1a7f49d1198a0519e6656ae23f2c267b909" exitCode=0 Oct 10 06:38:44 crc kubenswrapper[4822]: I1010 06:38:44.839707 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerDied","Data":"89269994d826c627eaa850dd47a8c1a7f49d1198a0519e6656ae23f2c267b909"} Oct 10 06:38:44 crc kubenswrapper[4822]: I1010 06:38:44.839789 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerStarted","Data":"ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1"} Oct 10 06:38:45 crc kubenswrapper[4822]: I1010 06:38:45.010381 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:45 crc kubenswrapper[4822]: I1010 06:38:45.048830 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:47 crc kubenswrapper[4822]: I1010 06:38:47.860185 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerStarted","Data":"4570c767afe4694bc9378deaa436b25709dc62e094406b06dc73ff3003807c8a"} Oct 10 06:38:48 crc kubenswrapper[4822]: I1010 06:38:48.867947 4822 generic.go:334] "Generic (PLEG): container finished" podID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerID="4570c767afe4694bc9378deaa436b25709dc62e094406b06dc73ff3003807c8a" exitCode=0 Oct 10 06:38:48 crc kubenswrapper[4822]: I1010 06:38:48.868267 4822 generic.go:334] "Generic (PLEG): container finished" podID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerID="aa6471bfa5d5b46acd37eabe3d91116f7b9cbfc7206f624440fe7b9e1b9c43ad" exitCode=0 Oct 10 06:38:48 crc kubenswrapper[4822]: I1010 06:38:48.867997 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerDied","Data":"4570c767afe4694bc9378deaa436b25709dc62e094406b06dc73ff3003807c8a"} Oct 10 06:38:48 crc kubenswrapper[4822]: I1010 06:38:48.868306 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerDied","Data":"aa6471bfa5d5b46acd37eabe3d91116f7b9cbfc7206f624440fe7b9e1b9c43ad"} Oct 10 06:38:49 crc kubenswrapper[4822]: I1010 06:38:49.992915 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-l5c7z" Oct 
10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.017283 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xq8m5" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.137849 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.209359 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtsts\" (UniqueName: \"kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts\") pod \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.209455 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle\") pod \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.209516 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util\") pod \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\" (UID: \"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9\") " Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.213978 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle" (OuterVolumeSpecName: "bundle") pod "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" (UID: "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.218501 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts" (OuterVolumeSpecName: "kube-api-access-rtsts") pod "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" (UID: "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9"). InnerVolumeSpecName "kube-api-access-rtsts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.283818 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util" (OuterVolumeSpecName: "util") pod "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" (UID: "e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.310561 4822 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-util\") on node \"crc\" DevicePath \"\"" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.310616 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtsts\" (UniqueName: \"kubernetes.io/projected/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-kube-api-access-rtsts\") on node \"crc\" DevicePath \"\"" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.310632 4822 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.883618 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" 
event={"ID":"e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9","Type":"ContainerDied","Data":"ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1"} Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.883652 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd6870071ef3b8aaadb5d5c11484cd44302e20d9d053c729e6df21da60b3ef1" Oct 10 06:38:50 crc kubenswrapper[4822]: I1010 06:38:50.883689 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.037852 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82"] Oct 10 06:38:57 crc kubenswrapper[4822]: E1010 06:38:57.039286 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="pull" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.039309 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="pull" Oct 10 06:38:57 crc kubenswrapper[4822]: E1010 06:38:57.039329 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="util" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.039341 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="util" Oct 10 06:38:57 crc kubenswrapper[4822]: E1010 06:38:57.039363 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="extract" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.039375 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="extract" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.039761 
4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9" containerName="extract" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.040717 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.044744 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.047141 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.047173 4822 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-czt9x" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.057826 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82"] Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.100652 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg2d5\" (UniqueName: \"kubernetes.io/projected/885ecb95-2dc2-4a32-ba53-7f5e65842555-kube-api-access-jg2d5\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lwc82\" (UID: \"885ecb95-2dc2-4a32-ba53-7f5e65842555\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.202048 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg2d5\" (UniqueName: \"kubernetes.io/projected/885ecb95-2dc2-4a32-ba53-7f5e65842555-kube-api-access-jg2d5\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lwc82\" (UID: 
\"885ecb95-2dc2-4a32-ba53-7f5e65842555\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.234789 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg2d5\" (UniqueName: \"kubernetes.io/projected/885ecb95-2dc2-4a32-ba53-7f5e65842555-kube-api-access-jg2d5\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lwc82\" (UID: \"885ecb95-2dc2-4a32-ba53-7f5e65842555\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.362666 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.563548 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82"] Oct 10 06:38:57 crc kubenswrapper[4822]: I1010 06:38:57.924461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" event={"ID":"885ecb95-2dc2-4a32-ba53-7f5e65842555","Type":"ContainerStarted","Data":"943f9db87bfbc68b9453cf9d1cc01dc03c7cedc8c87989a86de114a0fc95c850"} Oct 10 06:39:04 crc kubenswrapper[4822]: I1010 06:39:04.968352 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" event={"ID":"885ecb95-2dc2-4a32-ba53-7f5e65842555","Type":"ContainerStarted","Data":"9658a6b917db34067eaaa0469a0366c68942781a3ae38882e6d9cd991bf79eda"} Oct 10 06:39:04 crc kubenswrapper[4822]: I1010 06:39:04.993597 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lwc82" podStartSLOduration=1.367872728 
podStartE2EDuration="7.99357934s" podCreationTimestamp="2025-10-10 06:38:57 +0000 UTC" firstStartedPulling="2025-10-10 06:38:57.573475701 +0000 UTC m=+884.668633897" lastFinishedPulling="2025-10-10 06:39:04.199182293 +0000 UTC m=+891.294340509" observedRunningTime="2025-10-10 06:39:04.99324061 +0000 UTC m=+892.088398846" watchObservedRunningTime="2025-10-10 06:39:04.99357934 +0000 UTC m=+892.088737536" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.330055 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-d8lf4"] Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.330896 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.332992 4822 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-t5fbp" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.333431 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.333608 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.343006 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-d8lf4"] Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.448831 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7h42\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-kube-api-access-b7h42\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.449264 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-bound-sa-token\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.549846 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7h42\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-kube-api-access-b7h42\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.549947 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-bound-sa-token\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.573947 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7h42\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-kube-api-access-b7h42\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.578232 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/046b28df-836a-482d-8387-f40aef735dce-bound-sa-token\") pod \"cert-manager-webhook-d969966f-d8lf4\" (UID: \"046b28df-836a-482d-8387-f40aef735dce\") " pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 
06:39:07 crc kubenswrapper[4822]: I1010 06:39:07.648540 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:08 crc kubenswrapper[4822]: I1010 06:39:08.038330 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-d8lf4"] Oct 10 06:39:08 crc kubenswrapper[4822]: W1010 06:39:08.053920 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod046b28df_836a_482d_8387_f40aef735dce.slice/crio-2dbec30b2ada6dfd160eafb047ccc7c673cb2921127f4b6eb5e162d9e369256c WatchSource:0}: Error finding container 2dbec30b2ada6dfd160eafb047ccc7c673cb2921127f4b6eb5e162d9e369256c: Status 404 returned error can't find the container with id 2dbec30b2ada6dfd160eafb047ccc7c673cb2921127f4b6eb5e162d9e369256c Oct 10 06:39:08 crc kubenswrapper[4822]: I1010 06:39:08.991815 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" event={"ID":"046b28df-836a-482d-8387-f40aef735dce","Type":"ContainerStarted","Data":"2dbec30b2ada6dfd160eafb047ccc7c673cb2921127f4b6eb5e162d9e369256c"} Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.173788 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx"] Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.175678 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.178105 4822 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jz2fb" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.180223 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx"] Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.308212 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jccp\" (UniqueName: \"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-kube-api-access-7jccp\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.308276 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.410059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jccp\" (UniqueName: \"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-kube-api-access-7jccp\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.410152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.438375 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jccp\" (UniqueName: \"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-kube-api-access-7jccp\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.448289 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9258d7-7df3-4bb8-8190-7dcb1a930744-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lgjkx\" (UID: \"7d9258d7-7df3-4bb8-8190-7dcb1a930744\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:11 crc kubenswrapper[4822]: I1010 06:39:11.508192 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" Oct 10 06:39:12 crc kubenswrapper[4822]: I1010 06:39:12.215987 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx"] Oct 10 06:39:12 crc kubenswrapper[4822]: W1010 06:39:12.219521 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d9258d7_7df3_4bb8_8190_7dcb1a930744.slice/crio-c2a66b47f955dc092f6b7d7548666ce9f9d1a9f39d8098fcc30836955c2bfe6a WatchSource:0}: Error finding container c2a66b47f955dc092f6b7d7548666ce9f9d1a9f39d8098fcc30836955c2bfe6a: Status 404 returned error can't find the container with id c2a66b47f955dc092f6b7d7548666ce9f9d1a9f39d8098fcc30836955c2bfe6a Oct 10 06:39:13 crc kubenswrapper[4822]: I1010 06:39:13.012921 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" event={"ID":"046b28df-836a-482d-8387-f40aef735dce","Type":"ContainerStarted","Data":"77f8994ecbb0ad3f342beca7a832e7498bfd46563b2a7ea46eea449d2b18f1dd"} Oct 10 06:39:13 crc kubenswrapper[4822]: I1010 06:39:13.012995 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:13 crc kubenswrapper[4822]: I1010 06:39:13.014032 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" event={"ID":"7d9258d7-7df3-4bb8-8190-7dcb1a930744","Type":"ContainerStarted","Data":"bba479ab90956a19e8cb32022975c4c64dc4e56845f9b6628132b1ce7571573c"} Oct 10 06:39:13 crc kubenswrapper[4822]: I1010 06:39:13.014069 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" event={"ID":"7d9258d7-7df3-4bb8-8190-7dcb1a930744","Type":"ContainerStarted","Data":"c2a66b47f955dc092f6b7d7548666ce9f9d1a9f39d8098fcc30836955c2bfe6a"} Oct 10 06:39:13 crc 
kubenswrapper[4822]: I1010 06:39:13.026477 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" podStartSLOduration=2.189470572 podStartE2EDuration="6.026461093s" podCreationTimestamp="2025-10-10 06:39:07 +0000 UTC" firstStartedPulling="2025-10-10 06:39:08.056667194 +0000 UTC m=+895.151825420" lastFinishedPulling="2025-10-10 06:39:11.893657745 +0000 UTC m=+898.988815941" observedRunningTime="2025-10-10 06:39:13.025258818 +0000 UTC m=+900.120417024" watchObservedRunningTime="2025-10-10 06:39:13.026461093 +0000 UTC m=+900.121619289" Oct 10 06:39:13 crc kubenswrapper[4822]: I1010 06:39:13.042456 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lgjkx" podStartSLOduration=2.042438689 podStartE2EDuration="2.042438689s" podCreationTimestamp="2025-10-10 06:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:39:13.040175033 +0000 UTC m=+900.135333239" watchObservedRunningTime="2025-10-10 06:39:13.042438689 +0000 UTC m=+900.137596905" Oct 10 06:39:17 crc kubenswrapper[4822]: I1010 06:39:17.659502 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-d8lf4" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.276991 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pv7b6"] Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.278891 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.281527 4822 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-vw5qv" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.292224 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pv7b6"] Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.327843 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: \"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.327970 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgvfx\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-kube-api-access-tgvfx\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: \"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.429206 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgvfx\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-kube-api-access-tgvfx\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: \"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.429420 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: 
\"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.461564 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: \"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.461897 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgvfx\" (UniqueName: \"kubernetes.io/projected/7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0-kube-api-access-tgvfx\") pod \"cert-manager-7d4cc89fcb-pv7b6\" (UID: \"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0\") " pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:27 crc kubenswrapper[4822]: I1010 06:39:27.601885 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" Oct 10 06:39:28 crc kubenswrapper[4822]: I1010 06:39:28.052333 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-pv7b6"] Oct 10 06:39:28 crc kubenswrapper[4822]: I1010 06:39:28.135070 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" event={"ID":"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0","Type":"ContainerStarted","Data":"35efd2bba8856626db8bf4a7672a25b88f38ac0054f8a45b8144f1806aa28a81"} Oct 10 06:39:29 crc kubenswrapper[4822]: I1010 06:39:29.142927 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" event={"ID":"7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0","Type":"ContainerStarted","Data":"04f271d6df18073b4b33873c13f15097be79a6a46706c1d66aba28629b3ba064"} Oct 10 06:39:29 crc kubenswrapper[4822]: I1010 06:39:29.167155 4822 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-7d4cc89fcb-pv7b6" podStartSLOduration=2.167133311 podStartE2EDuration="2.167133311s" podCreationTimestamp="2025-10-10 06:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:39:29.162142827 +0000 UTC m=+916.257301023" watchObservedRunningTime="2025-10-10 06:39:29.167133311 +0000 UTC m=+916.262291527" Oct 10 06:39:31 crc kubenswrapper[4822]: I1010 06:39:31.336234 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:39:31 crc kubenswrapper[4822]: I1010 06:39:31.336293 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.246545 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.247337 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.252043 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-7cl85" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.252275 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.252424 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.284014 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.295904 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8c6\" (UniqueName: \"kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6\") pod \"openstack-operator-index-zklk2\" (UID: \"29562fd3-1dd0-452b-934e-a822c3804b91\") " pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.396941 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8c6\" (UniqueName: \"kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6\") pod \"openstack-operator-index-zklk2\" (UID: \"29562fd3-1dd0-452b-934e-a822c3804b91\") " pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.413906 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8c6\" (UniqueName: \"kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6\") pod \"openstack-operator-index-zklk2\" (UID: 
\"29562fd3-1dd0-452b-934e-a822c3804b91\") " pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:32 crc kubenswrapper[4822]: I1010 06:39:32.573605 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:33 crc kubenswrapper[4822]: I1010 06:39:33.004193 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:33 crc kubenswrapper[4822]: I1010 06:39:33.173668 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zklk2" event={"ID":"29562fd3-1dd0-452b-934e-a822c3804b91","Type":"ContainerStarted","Data":"768880b68ae803134597ee2c1471534b1be3a92bfbf07d4e9363ada7dcf109b8"} Oct 10 06:39:34 crc kubenswrapper[4822]: I1010 06:39:34.629400 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.035113 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7wjsb"] Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.036872 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.051417 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7wjsb"] Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.131480 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsgq\" (UniqueName: \"kubernetes.io/projected/f2d285c1-f470-4e43-a470-1bfad25e8ee8-kube-api-access-mhsgq\") pod \"openstack-operator-index-7wjsb\" (UID: \"f2d285c1-f470-4e43-a470-1bfad25e8ee8\") " pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.187295 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zklk2" event={"ID":"29562fd3-1dd0-452b-934e-a822c3804b91","Type":"ContainerStarted","Data":"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a"} Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.187446 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-zklk2" podUID="29562fd3-1dd0-452b-934e-a822c3804b91" containerName="registry-server" containerID="cri-o://4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a" gracePeriod=2 Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.208363 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zklk2" podStartSLOduration=1.261794141 podStartE2EDuration="3.208345276s" podCreationTimestamp="2025-10-10 06:39:32 +0000 UTC" firstStartedPulling="2025-10-10 06:39:33.010657257 +0000 UTC m=+920.105815453" lastFinishedPulling="2025-10-10 06:39:34.957208392 +0000 UTC m=+922.052366588" observedRunningTime="2025-10-10 06:39:35.203460945 +0000 UTC m=+922.298619161" watchObservedRunningTime="2025-10-10 
06:39:35.208345276 +0000 UTC m=+922.303503472" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.232336 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsgq\" (UniqueName: \"kubernetes.io/projected/f2d285c1-f470-4e43-a470-1bfad25e8ee8-kube-api-access-mhsgq\") pod \"openstack-operator-index-7wjsb\" (UID: \"f2d285c1-f470-4e43-a470-1bfad25e8ee8\") " pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.250249 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsgq\" (UniqueName: \"kubernetes.io/projected/f2d285c1-f470-4e43-a470-1bfad25e8ee8-kube-api-access-mhsgq\") pod \"openstack-operator-index-7wjsb\" (UID: \"f2d285c1-f470-4e43-a470-1bfad25e8ee8\") " pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.380109 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.525116 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.637565 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd8c6\" (UniqueName: \"kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6\") pod \"29562fd3-1dd0-452b-934e-a822c3804b91\" (UID: \"29562fd3-1dd0-452b-934e-a822c3804b91\") " Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.641794 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6" (OuterVolumeSpecName: "kube-api-access-hd8c6") pod "29562fd3-1dd0-452b-934e-a822c3804b91" (UID: "29562fd3-1dd0-452b-934e-a822c3804b91"). InnerVolumeSpecName "kube-api-access-hd8c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.739844 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd8c6\" (UniqueName: \"kubernetes.io/projected/29562fd3-1dd0-452b-934e-a822c3804b91-kube-api-access-hd8c6\") on node \"crc\" DevicePath \"\"" Oct 10 06:39:35 crc kubenswrapper[4822]: I1010 06:39:35.839200 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7wjsb"] Oct 10 06:39:35 crc kubenswrapper[4822]: W1010 06:39:35.842017 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d285c1_f470_4e43_a470_1bfad25e8ee8.slice/crio-cf062873d3b68bfee90ebe17fe2ee7b93e78acf4e3789b4629f5f89dfa9fea12 WatchSource:0}: Error finding container cf062873d3b68bfee90ebe17fe2ee7b93e78acf4e3789b4629f5f89dfa9fea12: Status 404 returned error can't find the container with id cf062873d3b68bfee90ebe17fe2ee7b93e78acf4e3789b4629f5f89dfa9fea12 Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.194945 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7wjsb" event={"ID":"f2d285c1-f470-4e43-a470-1bfad25e8ee8","Type":"ContainerStarted","Data":"7bfa48c5692f404be170b8c0d494346c99a847dc6874b902a95a8f4b3677f789"} Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.195309 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7wjsb" event={"ID":"f2d285c1-f470-4e43-a470-1bfad25e8ee8","Type":"ContainerStarted","Data":"cf062873d3b68bfee90ebe17fe2ee7b93e78acf4e3789b4629f5f89dfa9fea12"} Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.196409 4822 generic.go:334] "Generic (PLEG): container finished" podID="29562fd3-1dd0-452b-934e-a822c3804b91" containerID="4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a" exitCode=0 Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.196453 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zklk2" event={"ID":"29562fd3-1dd0-452b-934e-a822c3804b91","Type":"ContainerDied","Data":"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a"} Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.196471 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zklk2" Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.196496 4822 scope.go:117] "RemoveContainer" containerID="4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a" Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.196482 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zklk2" event={"ID":"29562fd3-1dd0-452b-934e-a822c3804b91","Type":"ContainerDied","Data":"768880b68ae803134597ee2c1471534b1be3a92bfbf07d4e9363ada7dcf109b8"} Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.222879 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7wjsb" podStartSLOduration=1.166051539 podStartE2EDuration="1.22285922s" podCreationTimestamp="2025-10-10 06:39:35 +0000 UTC" firstStartedPulling="2025-10-10 06:39:35.845254753 +0000 UTC m=+922.940412959" lastFinishedPulling="2025-10-10 06:39:35.902062414 +0000 UTC m=+922.997220640" observedRunningTime="2025-10-10 06:39:36.220551043 +0000 UTC m=+923.315709229" watchObservedRunningTime="2025-10-10 06:39:36.22285922 +0000 UTC m=+923.318017426" Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.237241 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.240507 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zklk2"] Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.241458 4822 scope.go:117] "RemoveContainer" containerID="4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a" Oct 10 06:39:36 crc kubenswrapper[4822]: E1010 06:39:36.242083 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a\": container with ID starting with 4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a not found: ID does not exist" containerID="4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a" Oct 10 06:39:36 crc kubenswrapper[4822]: I1010 06:39:36.242118 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a"} err="failed to get container status \"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a\": rpc error: code = NotFound desc = could not find container \"4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a\": container with ID starting with 4b2a23fc997e1ba845f0259bfadc7b1d8e9a9118aa8d90d7a743468d431f6c8a not found: ID does not exist" Oct 10 06:39:37 crc kubenswrapper[4822]: I1010 06:39:37.664081 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29562fd3-1dd0-452b-934e-a822c3804b91" path="/var/lib/kubelet/pods/29562fd3-1dd0-452b-934e-a822c3804b91/volumes" Oct 10 06:39:45 crc kubenswrapper[4822]: I1010 06:39:45.381150 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:45 crc kubenswrapper[4822]: I1010 06:39:45.381756 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:45 crc kubenswrapper[4822]: I1010 06:39:45.411960 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:46 crc kubenswrapper[4822]: I1010 06:39:46.304688 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7wjsb" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.424374 4822 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd"] Oct 10 06:39:52 crc kubenswrapper[4822]: E1010 06:39:52.424894 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29562fd3-1dd0-452b-934e-a822c3804b91" containerName="registry-server" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.424911 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="29562fd3-1dd0-452b-934e-a822c3804b91" containerName="registry-server" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.425150 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="29562fd3-1dd0-452b-934e-a822c3804b91" containerName="registry-server" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.426545 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.428872 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-k6m4f" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.430468 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd"] Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.477714 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.477888 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h965b\" (UniqueName: 
\"kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.477912 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.578952 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.579034 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h965b\" (UniqueName: \"kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.579058 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle\") pod 
\"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.579477 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.579497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.598692 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h965b\" (UniqueName: \"kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b\") pod \"d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:52 crc kubenswrapper[4822]: I1010 06:39:52.742733 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:53 crc kubenswrapper[4822]: I1010 06:39:53.013554 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd"] Oct 10 06:39:53 crc kubenswrapper[4822]: I1010 06:39:53.318529 4822 generic.go:334] "Generic (PLEG): container finished" podID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerID="b695c2caecd4c4186941da7ac3718d304c5598fe22a4ba4ce054f99b06f8345e" exitCode=0 Oct 10 06:39:53 crc kubenswrapper[4822]: I1010 06:39:53.318587 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" event={"ID":"1798b7f5-76e8-4545-a64e-e05a472f0eac","Type":"ContainerDied","Data":"b695c2caecd4c4186941da7ac3718d304c5598fe22a4ba4ce054f99b06f8345e"} Oct 10 06:39:53 crc kubenswrapper[4822]: I1010 06:39:53.318624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" event={"ID":"1798b7f5-76e8-4545-a64e-e05a472f0eac","Type":"ContainerStarted","Data":"42dbe2efcd274a1442fc40be7ec4737dd30e7ff641ce5ba08b7afe081484f524"} Oct 10 06:39:54 crc kubenswrapper[4822]: I1010 06:39:54.330224 4822 generic.go:334] "Generic (PLEG): container finished" podID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerID="76c103440067c3e8bc55573a496afcc47e8e27330a25bfba1ae14eb5af280de1" exitCode=0 Oct 10 06:39:54 crc kubenswrapper[4822]: I1010 06:39:54.330341 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" event={"ID":"1798b7f5-76e8-4545-a64e-e05a472f0eac","Type":"ContainerDied","Data":"76c103440067c3e8bc55573a496afcc47e8e27330a25bfba1ae14eb5af280de1"} Oct 10 06:39:55 crc kubenswrapper[4822]: I1010 06:39:55.339143 4822 generic.go:334] 
"Generic (PLEG): container finished" podID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerID="e12ddc35bdd641b12825df4ae479853cd544bd168d802f46576ecb4dbea34bad" exitCode=0 Oct 10 06:39:55 crc kubenswrapper[4822]: I1010 06:39:55.339217 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" event={"ID":"1798b7f5-76e8-4545-a64e-e05a472f0eac","Type":"ContainerDied","Data":"e12ddc35bdd641b12825df4ae479853cd544bd168d802f46576ecb4dbea34bad"} Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.643408 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.732611 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h965b\" (UniqueName: \"kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b\") pod \"1798b7f5-76e8-4545-a64e-e05a472f0eac\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.732690 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util\") pod \"1798b7f5-76e8-4545-a64e-e05a472f0eac\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.732738 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle\") pod \"1798b7f5-76e8-4545-a64e-e05a472f0eac\" (UID: \"1798b7f5-76e8-4545-a64e-e05a472f0eac\") " Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.734855 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle" (OuterVolumeSpecName: "bundle") pod "1798b7f5-76e8-4545-a64e-e05a472f0eac" (UID: "1798b7f5-76e8-4545-a64e-e05a472f0eac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.741881 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b" (OuterVolumeSpecName: "kube-api-access-h965b") pod "1798b7f5-76e8-4545-a64e-e05a472f0eac" (UID: "1798b7f5-76e8-4545-a64e-e05a472f0eac"). InnerVolumeSpecName "kube-api-access-h965b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.762451 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util" (OuterVolumeSpecName: "util") pod "1798b7f5-76e8-4545-a64e-e05a472f0eac" (UID: "1798b7f5-76e8-4545-a64e-e05a472f0eac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.834850 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h965b\" (UniqueName: \"kubernetes.io/projected/1798b7f5-76e8-4545-a64e-e05a472f0eac-kube-api-access-h965b\") on node \"crc\" DevicePath \"\"" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.834896 4822 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-util\") on node \"crc\" DevicePath \"\"" Oct 10 06:39:56 crc kubenswrapper[4822]: I1010 06:39:56.834909 4822 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1798b7f5-76e8-4545-a64e-e05a472f0eac-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:39:57 crc kubenswrapper[4822]: I1010 06:39:57.357601 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" event={"ID":"1798b7f5-76e8-4545-a64e-e05a472f0eac","Type":"ContainerDied","Data":"42dbe2efcd274a1442fc40be7ec4737dd30e7ff641ce5ba08b7afe081484f524"} Oct 10 06:39:57 crc kubenswrapper[4822]: I1010 06:39:57.357880 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd" Oct 10 06:39:57 crc kubenswrapper[4822]: I1010 06:39:57.357883 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42dbe2efcd274a1442fc40be7ec4737dd30e7ff641ce5ba08b7afe081484f524" Oct 10 06:40:01 crc kubenswrapper[4822]: I1010 06:40:01.336292 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:40:01 crc kubenswrapper[4822]: I1010 06:40:01.336913 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.864249 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr"] Oct 10 06:40:04 crc kubenswrapper[4822]: E1010 06:40:04.864617 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="util" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.864637 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="util" Oct 10 06:40:04 crc kubenswrapper[4822]: E1010 06:40:04.864656 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="pull" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.864670 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="pull" Oct 10 06:40:04 crc kubenswrapper[4822]: E1010 06:40:04.864691 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="extract" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.864705 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="extract" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.864945 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1798b7f5-76e8-4545-a64e-e05a472f0eac" containerName="extract" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.866043 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.868191 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gm99j" Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.888975 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr"] Oct 10 06:40:04 crc kubenswrapper[4822]: I1010 06:40:04.961551 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxx7\" (UniqueName: \"kubernetes.io/projected/9d259a92-f5da-477f-921c-274e9d77cd01-kube-api-access-tjxx7\") pod \"openstack-operator-controller-operator-8485b86f76-68qqr\" (UID: \"9d259a92-f5da-477f-921c-274e9d77cd01\") " pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:05 crc kubenswrapper[4822]: I1010 06:40:05.062664 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxx7\" (UniqueName: 
\"kubernetes.io/projected/9d259a92-f5da-477f-921c-274e9d77cd01-kube-api-access-tjxx7\") pod \"openstack-operator-controller-operator-8485b86f76-68qqr\" (UID: \"9d259a92-f5da-477f-921c-274e9d77cd01\") " pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:05 crc kubenswrapper[4822]: I1010 06:40:05.086680 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxx7\" (UniqueName: \"kubernetes.io/projected/9d259a92-f5da-477f-921c-274e9d77cd01-kube-api-access-tjxx7\") pod \"openstack-operator-controller-operator-8485b86f76-68qqr\" (UID: \"9d259a92-f5da-477f-921c-274e9d77cd01\") " pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:05 crc kubenswrapper[4822]: I1010 06:40:05.185773 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:05 crc kubenswrapper[4822]: I1010 06:40:05.604550 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr"] Oct 10 06:40:06 crc kubenswrapper[4822]: I1010 06:40:06.433179 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" event={"ID":"9d259a92-f5da-477f-921c-274e9d77cd01","Type":"ContainerStarted","Data":"32c21c6f91ab497b976e1e74c0c6f3c11b30f21b62fba69ea02b9e126cc0d711"} Oct 10 06:40:10 crc kubenswrapper[4822]: I1010 06:40:10.467313 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" event={"ID":"9d259a92-f5da-477f-921c-274e9d77cd01","Type":"ContainerStarted","Data":"495eb40318a2a944332cd8c9b853dc8b417672305cb4be5bd7918335967184d4"} Oct 10 06:40:12 crc kubenswrapper[4822]: I1010 06:40:12.481046 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" event={"ID":"9d259a92-f5da-477f-921c-274e9d77cd01","Type":"ContainerStarted","Data":"ea5e36e73246e63dee28e0ccb26f5e2951963f3ee5a61bbf4c1f213cdd716f3c"} Oct 10 06:40:12 crc kubenswrapper[4822]: I1010 06:40:12.481323 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:12 crc kubenswrapper[4822]: I1010 06:40:12.518370 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" podStartSLOduration=2.354544145 podStartE2EDuration="8.518341939s" podCreationTimestamp="2025-10-10 06:40:04 +0000 UTC" firstStartedPulling="2025-10-10 06:40:05.609642929 +0000 UTC m=+952.704801125" lastFinishedPulling="2025-10-10 06:40:11.773440723 +0000 UTC m=+958.868598919" observedRunningTime="2025-10-10 06:40:12.511507252 +0000 UTC m=+959.606665478" watchObservedRunningTime="2025-10-10 06:40:12.518341939 +0000 UTC m=+959.613500165" Oct 10 06:40:15 crc kubenswrapper[4822]: I1010 06:40:15.191844 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-8485b86f76-68qqr" Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.336379 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.337041 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.337104 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.338063 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.338159 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122" gracePeriod=600 Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.600177 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122" exitCode=0 Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.600456 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122"} Oct 10 06:40:31 crc kubenswrapper[4822]: I1010 06:40:31.600486 4822 scope.go:117] "RemoveContainer" containerID="58cc4a6333405580cf6ed60b3760010cf9fb05805218283cbce7d19149e2db60" Oct 10 06:40:32 crc kubenswrapper[4822]: I1010 06:40:32.615325 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812"} Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.862474 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.864985 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.869141 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.870554 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.872152 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qm9j7" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.872850 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7m259" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.872846 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.892447 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.893741 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.896534 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hvj5c" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.908047 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.928899 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.930361 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.932440 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.935417 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ql7wk" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.936960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxh5h\" (UniqueName: \"kubernetes.io/projected/e757a212-d95c-4ffc-ae84-ceca5cc56cc2-kube-api-access-zxh5h\") pod \"cinder-operator-controller-manager-59cdc64769-2h5rm\" (UID: \"e757a212-d95c-4ffc-ae84-ceca5cc56cc2\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.937016 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjs85\" 
(UniqueName: \"kubernetes.io/projected/be44c9df-65d4-4a6a-8646-7687f601f6b6-kube-api-access-xjs85\") pod \"designate-operator-controller-manager-687df44cdb-j9ksb\" (UID: \"be44c9df-65d4-4a6a-8646-7687f601f6b6\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.937037 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wc4\" (UniqueName: \"kubernetes.io/projected/a21a0dcd-c8b2-4ea4-ab7e-edae527ab347-kube-api-access-v8wc4\") pod \"barbican-operator-controller-manager-64f84fcdbb-wpwlg\" (UID: \"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.948528 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.952157 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55"] Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.953726 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.955422 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lmhr7" Oct 10 06:40:39 crc kubenswrapper[4822]: I1010 06:40:39.976822 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.001439 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.002627 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.045560 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mrh4j" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052138 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgnd\" (UniqueName: \"kubernetes.io/projected/3e087283-f802-4b9c-9f1f-bbca4e30a892-kube-api-access-tzgnd\") pod \"glance-operator-controller-manager-7bb46cd7d-76qx8\" (UID: \"3e087283-f802-4b9c-9f1f-bbca4e30a892\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052201 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxh5h\" (UniqueName: \"kubernetes.io/projected/e757a212-d95c-4ffc-ae84-ceca5cc56cc2-kube-api-access-zxh5h\") pod \"cinder-operator-controller-manager-59cdc64769-2h5rm\" (UID: \"e757a212-d95c-4ffc-ae84-ceca5cc56cc2\") " 
pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052233 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clkfg\" (UniqueName: \"kubernetes.io/projected/9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8-kube-api-access-clkfg\") pod \"horizon-operator-controller-manager-6d74794d9b-gmgqr\" (UID: \"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052274 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wc4\" (UniqueName: \"kubernetes.io/projected/a21a0dcd-c8b2-4ea4-ab7e-edae527ab347-kube-api-access-v8wc4\") pod \"barbican-operator-controller-manager-64f84fcdbb-wpwlg\" (UID: \"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052296 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjs85\" (UniqueName: \"kubernetes.io/projected/be44c9df-65d4-4a6a-8646-7687f601f6b6-kube-api-access-xjs85\") pod \"designate-operator-controller-manager-687df44cdb-j9ksb\" (UID: \"be44c9df-65d4-4a6a-8646-7687f601f6b6\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.052358 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnt22\" (UniqueName: \"kubernetes.io/projected/367aa79b-9342-431b-ade4-a9195844ce4a-kube-api-access-hnt22\") pod \"heat-operator-controller-manager-6d9967f8dd-grq55\" (UID: \"367aa79b-9342-431b-ade4-a9195844ce4a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:40 crc 
kubenswrapper[4822]: I1010 06:40:40.076032 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjs85\" (UniqueName: \"kubernetes.io/projected/be44c9df-65d4-4a6a-8646-7687f601f6b6-kube-api-access-xjs85\") pod \"designate-operator-controller-manager-687df44cdb-j9ksb\" (UID: \"be44c9df-65d4-4a6a-8646-7687f601f6b6\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.082979 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.084207 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxh5h\" (UniqueName: \"kubernetes.io/projected/e757a212-d95c-4ffc-ae84-ceca5cc56cc2-kube-api-access-zxh5h\") pod \"cinder-operator-controller-manager-59cdc64769-2h5rm\" (UID: \"e757a212-d95c-4ffc-ae84-ceca5cc56cc2\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.097318 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.098645 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.107096 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.107479 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ccn64" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.111374 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.122937 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.124050 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.128485 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wc4\" (UniqueName: \"kubernetes.io/projected/a21a0dcd-c8b2-4ea4-ab7e-edae527ab347-kube-api-access-v8wc4\") pod \"barbican-operator-controller-manager-64f84fcdbb-wpwlg\" (UID: \"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.140090 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-s8djz" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.153282 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65"] Oct 10 06:40:40 crc kubenswrapper[4822]: 
I1010 06:40:40.154093 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.154141 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgnd\" (UniqueName: \"kubernetes.io/projected/3e087283-f802-4b9c-9f1f-bbca4e30a892-kube-api-access-tzgnd\") pod \"glance-operator-controller-manager-7bb46cd7d-76qx8\" (UID: \"3e087283-f802-4b9c-9f1f-bbca4e30a892\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.154177 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clkfg\" (UniqueName: \"kubernetes.io/projected/9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8-kube-api-access-clkfg\") pod \"horizon-operator-controller-manager-6d74794d9b-gmgqr\" (UID: \"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.154205 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdzqj\" (UniqueName: \"kubernetes.io/projected/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-kube-api-access-hdzqj\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.154265 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnt22\" (UniqueName: 
\"kubernetes.io/projected/367aa79b-9342-431b-ade4-a9195844ce4a-kube-api-access-hnt22\") pod \"heat-operator-controller-manager-6d9967f8dd-grq55\" (UID: \"367aa79b-9342-431b-ade4-a9195844ce4a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.161913 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.163424 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.167121 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-f4x6h" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.170870 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.178327 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.179813 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.185208 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tmwt5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.188153 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.192912 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.195297 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgnd\" (UniqueName: \"kubernetes.io/projected/3e087283-f802-4b9c-9f1f-bbca4e30a892-kube-api-access-tzgnd\") pod \"glance-operator-controller-manager-7bb46cd7d-76qx8\" (UID: \"3e087283-f802-4b9c-9f1f-bbca4e30a892\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.197444 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.208842 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.210894 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.214078 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clkfg\" (UniqueName: \"kubernetes.io/projected/9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8-kube-api-access-clkfg\") pod \"horizon-operator-controller-manager-6d74794d9b-gmgqr\" (UID: \"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.214533 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.218500 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnt22\" (UniqueName: \"kubernetes.io/projected/367aa79b-9342-431b-ade4-a9195844ce4a-kube-api-access-hnt22\") pod \"heat-operator-controller-manager-6d9967f8dd-grq55\" (UID: \"367aa79b-9342-431b-ade4-a9195844ce4a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.221323 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xkx29" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.247412 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.248522 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.251089 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255072 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255129 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwk2\" (UniqueName: \"kubernetes.io/projected/f56e0976-eb7a-4bcf-bde2-016c83567fc6-kube-api-access-jpwk2\") pod \"ironic-operator-controller-manager-74cb5cbc49-tbj65\" (UID: \"f56e0976-eb7a-4bcf-bde2-016c83567fc6\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255187 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdzqj\" (UniqueName: \"kubernetes.io/projected/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-kube-api-access-hdzqj\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r4hc\" (UniqueName: \"kubernetes.io/projected/1ff50152-dd82-48ae-bca4-150c0f892185-kube-api-access-9r4hc\") pod \"mariadb-operator-controller-manager-5777b4f897-dt47l\" (UID: \"1ff50152-dd82-48ae-bca4-150c0f892185\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255259 
4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9mk\" (UniqueName: \"kubernetes.io/projected/49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42-kube-api-access-jl9mk\") pod \"keystone-operator-controller-manager-ddb98f99b-4h8gw\" (UID: \"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.255302 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwnh\" (UniqueName: \"kubernetes.io/projected/6133aeb2-a9e5-4170-a6e1-b562cdb97975-kube-api-access-xcwnh\") pod \"manila-operator-controller-manager-59578bc799-2fwqj\" (UID: \"6133aeb2-a9e5-4170-a6e1-b562cdb97975\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:40 crc kubenswrapper[4822]: E1010 06:40:40.255456 4822 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 10 06:40:40 crc kubenswrapper[4822]: E1010 06:40:40.255513 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert podName:f3f904c2-4da1-46c2-83c6-2ba18d9ccc50 nodeName:}" failed. No retries permitted until 2025-10-10 06:40:40.755490594 +0000 UTC m=+987.850648790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert") pod "infra-operator-controller-manager-585fc5b659-rtgvd" (UID: "f3f904c2-4da1-46c2-83c6-2ba18d9ccc50") : secret "infra-operator-webhook-server-cert" not found Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.262676 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pghf5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.270878 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.279573 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.290226 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.291576 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.299248 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.302202 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-g2mhf" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.317049 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.324157 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdzqj\" (UniqueName: \"kubernetes.io/projected/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-kube-api-access-hdzqj\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.335454 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.340970 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tc6nk" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.349563 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.356928 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9mk\" (UniqueName: \"kubernetes.io/projected/49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42-kube-api-access-jl9mk\") pod \"keystone-operator-controller-manager-ddb98f99b-4h8gw\" (UID: \"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwnh\" (UniqueName: \"kubernetes.io/projected/6133aeb2-a9e5-4170-a6e1-b562cdb97975-kube-api-access-xcwnh\") pod \"manila-operator-controller-manager-59578bc799-2fwqj\" (UID: \"6133aeb2-a9e5-4170-a6e1-b562cdb97975\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357084 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwk2\" (UniqueName: \"kubernetes.io/projected/f56e0976-eb7a-4bcf-bde2-016c83567fc6-kube-api-access-jpwk2\") pod \"ironic-operator-controller-manager-74cb5cbc49-tbj65\" (UID: \"f56e0976-eb7a-4bcf-bde2-016c83567fc6\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357125 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77w6t\" (UniqueName: \"kubernetes.io/projected/8c66ad3b-7962-4747-97d4-a2c183d25ebc-kube-api-access-77w6t\") pod \"octavia-operator-controller-manager-6d7c7ddf95-7d77l\" (UID: \"8c66ad3b-7962-4747-97d4-a2c183d25ebc\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357160 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kwp\" (UniqueName: \"kubernetes.io/projected/111bbcd0-554c-4705-a874-1d3aa399a391-kube-api-access-s5kwp\") pod \"nova-operator-controller-manager-57bb74c7bf-k8fww\" (UID: \"111bbcd0-554c-4705-a874-1d3aa399a391\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357205 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r4hc\" (UniqueName: \"kubernetes.io/projected/1ff50152-dd82-48ae-bca4-150c0f892185-kube-api-access-9r4hc\") pod \"mariadb-operator-controller-manager-5777b4f897-dt47l\" (UID: \"1ff50152-dd82-48ae-bca4-150c0f892185\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.357231 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pmnc\" (UniqueName: \"kubernetes.io/projected/8d454f08-e347-4e39-8392-9c5b4a2a8f6b-kube-api-access-4pmnc\") pod \"neutron-operator-controller-manager-797d478b46-rvhj6\" (UID: \"8d454f08-e347-4e39-8392-9c5b4a2a8f6b\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.361424 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.375762 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwnh\" (UniqueName: \"kubernetes.io/projected/6133aeb2-a9e5-4170-a6e1-b562cdb97975-kube-api-access-xcwnh\") pod \"manila-operator-controller-manager-59578bc799-2fwqj\" (UID: \"6133aeb2-a9e5-4170-a6e1-b562cdb97975\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.383627 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9mk\" (UniqueName: \"kubernetes.io/projected/49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42-kube-api-access-jl9mk\") pod \"keystone-operator-controller-manager-ddb98f99b-4h8gw\" (UID: \"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.384546 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r4hc\" (UniqueName: \"kubernetes.io/projected/1ff50152-dd82-48ae-bca4-150c0f892185-kube-api-access-9r4hc\") pod \"mariadb-operator-controller-manager-5777b4f897-dt47l\" (UID: \"1ff50152-dd82-48ae-bca4-150c0f892185\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.392924 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwk2\" (UniqueName: \"kubernetes.io/projected/f56e0976-eb7a-4bcf-bde2-016c83567fc6-kube-api-access-jpwk2\") pod \"ironic-operator-controller-manager-74cb5cbc49-tbj65\" (UID: \"f56e0976-eb7a-4bcf-bde2-016c83567fc6\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.416880 4822 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.442464 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.444944 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.447882 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-thtkk" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.450050 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.458492 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77w6t\" (UniqueName: \"kubernetes.io/projected/8c66ad3b-7962-4747-97d4-a2c183d25ebc-kube-api-access-77w6t\") pod \"octavia-operator-controller-manager-6d7c7ddf95-7d77l\" (UID: \"8c66ad3b-7962-4747-97d4-a2c183d25ebc\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.458542 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kwp\" (UniqueName: \"kubernetes.io/projected/111bbcd0-554c-4705-a874-1d3aa399a391-kube-api-access-s5kwp\") pod \"nova-operator-controller-manager-57bb74c7bf-k8fww\" (UID: \"111bbcd0-554c-4705-a874-1d3aa399a391\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.487416 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-77w6t\" (UniqueName: \"kubernetes.io/projected/8c66ad3b-7962-4747-97d4-a2c183d25ebc-kube-api-access-77w6t\") pod \"octavia-operator-controller-manager-6d7c7ddf95-7d77l\" (UID: \"8c66ad3b-7962-4747-97d4-a2c183d25ebc\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.458604 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pmnc\" (UniqueName: \"kubernetes.io/projected/8d454f08-e347-4e39-8392-9c5b4a2a8f6b-kube-api-access-4pmnc\") pod \"neutron-operator-controller-manager-797d478b46-rvhj6\" (UID: \"8d454f08-e347-4e39-8392-9c5b4a2a8f6b\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.497243 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kwp\" (UniqueName: \"kubernetes.io/projected/111bbcd0-554c-4705-a874-1d3aa399a391-kube-api-access-s5kwp\") pod \"nova-operator-controller-manager-57bb74c7bf-k8fww\" (UID: \"111bbcd0-554c-4705-a874-1d3aa399a391\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.497379 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pmnc\" (UniqueName: \"kubernetes.io/projected/8d454f08-e347-4e39-8392-9c5b4a2a8f6b-kube-api-access-4pmnc\") pod \"neutron-operator-controller-manager-797d478b46-rvhj6\" (UID: \"8d454f08-e347-4e39-8392-9c5b4a2a8f6b\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.497385 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.501125 4822 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.503618 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n699j" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.513205 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.514613 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.524901 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-njhrr" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.525517 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.551477 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.554067 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.565078 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.588888 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.592545 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tjt\" (UniqueName: \"kubernetes.io/projected/b923e24a-92ce-4c4a-8c26-d4fe2b1563ad-kube-api-access-24tjt\") pod \"ovn-operator-controller-manager-869cc7797f-tb22w\" (UID: \"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.592636 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95z6m\" (UniqueName: \"kubernetes.io/projected/04dda440-ebd4-412b-9a18-655a9721229d-kube-api-access-95z6m\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.592669 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8gc\" (UniqueName: \"kubernetes.io/projected/0da053a2-94d4-41de-87a8-d7f2662d9b5b-kube-api-access-ff8gc\") pod \"placement-operator-controller-manager-664664cb68-9q6b5\" (UID: \"0da053a2-94d4-41de-87a8-d7f2662d9b5b\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.592765 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.617009 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.626768 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.638656 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.640479 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8grtk" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.649588 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.676086 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.694098 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.694317 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tjt\" (UniqueName: \"kubernetes.io/projected/b923e24a-92ce-4c4a-8c26-d4fe2b1563ad-kube-api-access-24tjt\") pod \"ovn-operator-controller-manager-869cc7797f-tb22w\" (UID: \"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.694407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kfv\" (UniqueName: \"kubernetes.io/projected/9544ed2b-6308-4681-a120-d134ee029ded-kube-api-access-c2kfv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-b5g5r\" (UID: \"9544ed2b-6308-4681-a120-d134ee029ded\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.694503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95z6m\" (UniqueName: \"kubernetes.io/projected/04dda440-ebd4-412b-9a18-655a9721229d-kube-api-access-95z6m\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc 
kubenswrapper[4822]: I1010 06:40:40.694585 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8gc\" (UniqueName: \"kubernetes.io/projected/0da053a2-94d4-41de-87a8-d7f2662d9b5b-kube-api-access-ff8gc\") pod \"placement-operator-controller-manager-664664cb68-9q6b5\" (UID: \"0da053a2-94d4-41de-87a8-d7f2662d9b5b\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:40 crc kubenswrapper[4822]: E1010 06:40:40.694835 4822 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 06:40:40 crc kubenswrapper[4822]: E1010 06:40:40.694954 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert podName:04dda440-ebd4-412b-9a18-655a9721229d nodeName:}" failed. No retries permitted until 2025-10-10 06:40:41.194932895 +0000 UTC m=+988.290091091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" (UID: "04dda440-ebd4-412b-9a18-655a9721229d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.727884 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8gc\" (UniqueName: \"kubernetes.io/projected/0da053a2-94d4-41de-87a8-d7f2662d9b5b-kube-api-access-ff8gc\") pod \"placement-operator-controller-manager-664664cb68-9q6b5\" (UID: \"0da053a2-94d4-41de-87a8-d7f2662d9b5b\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.729907 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tjt\" (UniqueName: \"kubernetes.io/projected/b923e24a-92ce-4c4a-8c26-d4fe2b1563ad-kube-api-access-24tjt\") pod \"ovn-operator-controller-manager-869cc7797f-tb22w\" (UID: \"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.734165 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.739910 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95z6m\" (UniqueName: \"kubernetes.io/projected/04dda440-ebd4-412b-9a18-655a9721229d-kube-api-access-95z6m\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.740779 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.741820 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.743599 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.767687 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.768829 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-d4wpq" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.796511 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: \"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.796560 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5tx6\" (UniqueName: \"kubernetes.io/projected/d01da9fa-a63b-4496-bea1-37048e323618-kube-api-access-z5tx6\") pod \"telemetry-operator-controller-manager-578874c84d-2cvmc\" (UID: \"d01da9fa-a63b-4496-bea1-37048e323618\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.796613 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kfv\" (UniqueName: \"kubernetes.io/projected/9544ed2b-6308-4681-a120-d134ee029ded-kube-api-access-c2kfv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-b5g5r\" (UID: \"9544ed2b-6308-4681-a120-d134ee029ded\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.802923 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3f904c2-4da1-46c2-83c6-2ba18d9ccc50-cert\") pod \"infra-operator-controller-manager-585fc5b659-rtgvd\" (UID: 
\"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.819874 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.828181 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kfv\" (UniqueName: \"kubernetes.io/projected/9544ed2b-6308-4681-a120-d134ee029ded-kube-api-access-c2kfv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-b5g5r\" (UID: \"9544ed2b-6308-4681-a120-d134ee029ded\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.848848 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.862316 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.864151 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.866512 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qjvdx" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.883176 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.888224 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-hh26l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.889944 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.892348 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-hh26l"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.893262 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wvj2d" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.899568 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5tx6\" (UniqueName: \"kubernetes.io/projected/d01da9fa-a63b-4496-bea1-37048e323618-kube-api-access-z5tx6\") pod \"telemetry-operator-controller-manager-578874c84d-2cvmc\" (UID: \"d01da9fa-a63b-4496-bea1-37048e323618\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.931896 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d"] Oct 10 06:40:40 crc 
kubenswrapper[4822]: I1010 06:40:40.933485 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.936472 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.936624 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-cflm5" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.939108 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5tx6\" (UniqueName: \"kubernetes.io/projected/d01da9fa-a63b-4496-bea1-37048e323618-kube-api-access-z5tx6\") pod \"telemetry-operator-controller-manager-578874c84d-2cvmc\" (UID: \"d01da9fa-a63b-4496-bea1-37048e323618\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.942214 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.947923 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.949113 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.951008 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wqrnc" Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.952680 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8"] Oct 10 06:40:40 crc kubenswrapper[4822]: I1010 06:40:40.999221 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.000675 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzjq\" (UniqueName: \"kubernetes.io/projected/b617d013-1412-4783-b71f-f3142cf15c35-kube-api-access-qjzjq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-glnm8\" (UID: \"b617d013-1412-4783-b71f-f3142cf15c35\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.000867 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljrkx\" (UniqueName: \"kubernetes.io/projected/7c2224b9-8bd4-4967-9e24-59ce223b2e0e-kube-api-access-ljrkx\") pod \"test-operator-controller-manager-ffcdd6c94-2vmqc\" (UID: \"7c2224b9-8bd4-4967-9e24-59ce223b2e0e\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.000995 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrrg\" (UniqueName: \"kubernetes.io/projected/57520796-d080-466b-9070-c4cd032ed8ab-kube-api-access-hcrrg\") pod 
\"watcher-operator-controller-manager-646675d848-hh26l\" (UID: \"57520796-d080-466b-9070-c4cd032ed8ab\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.001168 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5178ccba-ae40-49f5-9fba-6df6b0fbb562-cert\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.001322 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwwx\" (UniqueName: \"kubernetes.io/projected/5178ccba-ae40-49f5-9fba-6df6b0fbb562-kube-api-access-9rwwx\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.041041 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.042115 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.064196 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.074657 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.102593 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrkx\" (UniqueName: \"kubernetes.io/projected/7c2224b9-8bd4-4967-9e24-59ce223b2e0e-kube-api-access-ljrkx\") pod \"test-operator-controller-manager-ffcdd6c94-2vmqc\" (UID: \"7c2224b9-8bd4-4967-9e24-59ce223b2e0e\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.102656 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrrg\" (UniqueName: \"kubernetes.io/projected/57520796-d080-466b-9070-c4cd032ed8ab-kube-api-access-hcrrg\") pod \"watcher-operator-controller-manager-646675d848-hh26l\" (UID: \"57520796-d080-466b-9070-c4cd032ed8ab\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.102693 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5178ccba-ae40-49f5-9fba-6df6b0fbb562-cert\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.102756 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwwx\" (UniqueName: \"kubernetes.io/projected/5178ccba-ae40-49f5-9fba-6df6b0fbb562-kube-api-access-9rwwx\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.102780 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzjq\" (UniqueName: \"kubernetes.io/projected/b617d013-1412-4783-b71f-f3142cf15c35-kube-api-access-qjzjq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-glnm8\" (UID: \"b617d013-1412-4783-b71f-f3142cf15c35\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.108365 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5178ccba-ae40-49f5-9fba-6df6b0fbb562-cert\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.124773 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwwx\" (UniqueName: \"kubernetes.io/projected/5178ccba-ae40-49f5-9fba-6df6b0fbb562-kube-api-access-9rwwx\") pod \"openstack-operator-controller-manager-58fd854765-9cj6d\" (UID: \"5178ccba-ae40-49f5-9fba-6df6b0fbb562\") " pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.125444 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzjq\" (UniqueName: \"kubernetes.io/projected/b617d013-1412-4783-b71f-f3142cf15c35-kube-api-access-qjzjq\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-glnm8\" (UID: \"b617d013-1412-4783-b71f-f3142cf15c35\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.125773 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.133429 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrrg\" (UniqueName: \"kubernetes.io/projected/57520796-d080-466b-9070-c4cd032ed8ab-kube-api-access-hcrrg\") pod \"watcher-operator-controller-manager-646675d848-hh26l\" (UID: \"57520796-d080-466b-9070-c4cd032ed8ab\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.138610 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljrkx\" (UniqueName: \"kubernetes.io/projected/7c2224b9-8bd4-4967-9e24-59ce223b2e0e-kube-api-access-ljrkx\") pod \"test-operator-controller-manager-ffcdd6c94-2vmqc\" (UID: \"7c2224b9-8bd4-4967-9e24-59ce223b2e0e\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.197192 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.204106 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.204292 4822 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.204343 4822 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert podName:04dda440-ebd4-412b-9a18-655a9721229d nodeName:}" failed. No retries permitted until 2025-10-10 06:40:42.204328077 +0000 UTC m=+989.299486273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" (UID: "04dda440-ebd4-412b-9a18-655a9721229d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.212655 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.248465 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.283975 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.304181 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.328551 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.463884 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.465031 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367aa79b_9342_431b_ade4_a9195844ce4a.slice/crio-9fdd6ef41d55391eb26544174e36109678939a70200c0df9a99311989dd477e9 WatchSource:0}: Error finding container 9fdd6ef41d55391eb26544174e36109678939a70200c0df9a99311989dd477e9: Status 404 returned error can't find the container with id 9fdd6ef41d55391eb26544174e36109678939a70200c0df9a99311989dd477e9 Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.485466 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.497692 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.506687 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.528305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.632521 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d454f08_e347_4e39_8392_9c5b4a2a8f6b.slice/crio-a666503cd9d674f2c80948475ea5fdadd163eed7c7ed4654637f45eaaa6edb8e WatchSource:0}: Error finding container 
a666503cd9d674f2c80948475ea5fdadd163eed7c7ed4654637f45eaaa6edb8e: Status 404 returned error can't find the container with id a666503cd9d674f2c80948475ea5fdadd163eed7c7ed4654637f45eaaa6edb8e Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.637939 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod111bbcd0_554c_4705_a874_1d3aa399a391.slice/crio-81a8ffa497026a6db96948478c3b9d50b1b1cefb15f4f9855e3de9a3e6825877 WatchSource:0}: Error finding container 81a8ffa497026a6db96948478c3b9d50b1b1cefb15f4f9855e3de9a3e6825877: Status 404 returned error can't find the container with id 81a8ffa497026a6db96948478c3b9d50b1b1cefb15f4f9855e3de9a3e6825877 Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.640567 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.643769 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff50152_dd82_48ae_bca4_150c0f892185.slice/crio-fc7da59d3304c6c6c65d4d6d7a014a8efbe2eba8dda6018c138b3729b8c65f08 WatchSource:0}: Error finding container fc7da59d3304c6c6c65d4d6d7a014a8efbe2eba8dda6018c138b3729b8c65f08: Status 404 returned error can't find the container with id fc7da59d3304c6c6c65d4d6d7a014a8efbe2eba8dda6018c138b3729b8c65f08 Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.645630 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6"] Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.651442 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-24tjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-869cc7797f-tb22w_openstack-operators(b923e24a-92ce-4c4a-8c26-d4fe2b1563ad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.677330 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c66ad3b_7962_4747_97d4_a2c183d25ebc.slice/crio-a4c6a126f7d357246893a62896313bad9944f16124d9a285ae965c7982561bb7 WatchSource:0}: Error finding container a4c6a126f7d357246893a62896313bad9944f16124d9a285ae965c7982561bb7: Status 404 returned error can't find the container with id a4c6a126f7d357246893a62896313bad9944f16124d9a285ae965c7982561bb7 Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.682357 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.682388 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.682400 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.722594 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" event={"ID":"367aa79b-9342-431b-ade4-a9195844ce4a","Type":"ContainerStarted","Data":"9fdd6ef41d55391eb26544174e36109678939a70200c0df9a99311989dd477e9"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.724368 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" event={"ID":"6133aeb2-a9e5-4170-a6e1-b562cdb97975","Type":"ContainerStarted","Data":"3026239dc1e57530ea00c66755716f84af8705f35af4181d39f18b960ea5f3bd"} Oct 10 
06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.725352 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" event={"ID":"8d454f08-e347-4e39-8392-9c5b4a2a8f6b","Type":"ContainerStarted","Data":"a666503cd9d674f2c80948475ea5fdadd163eed7c7ed4654637f45eaaa6edb8e"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.726549 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" event={"ID":"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347","Type":"ContainerStarted","Data":"f512fbf4d223809c8190fe8aa8ef523b94d52fdca758a7d9aa72f5c4b04ebf93"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.728105 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" event={"ID":"e757a212-d95c-4ffc-ae84-ceca5cc56cc2","Type":"ContainerStarted","Data":"fbd3917f879be6ecd495c3de3e6d762864aa599a118e5088a805a55f02d5f83a"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.728846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" event={"ID":"f56e0976-eb7a-4bcf-bde2-016c83567fc6","Type":"ContainerStarted","Data":"4b48b379d56a814908a982ac5413131df4b72a2fc3255afa20859483a87d6c2a"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.730694 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" event={"ID":"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42","Type":"ContainerStarted","Data":"3e8b163b30ed91d4ba8c0caf39e662465ba656000885dda4230e0a2443c85665"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.733245 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" 
event={"ID":"1ff50152-dd82-48ae-bca4-150c0f892185","Type":"ContainerStarted","Data":"fc7da59d3304c6c6c65d4d6d7a014a8efbe2eba8dda6018c138b3729b8c65f08"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.734279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" event={"ID":"3e087283-f802-4b9c-9f1f-bbca4e30a892","Type":"ContainerStarted","Data":"499914ff6ce2b03a3b1045b23f26e7c0078eb688812216ea8b1a2e27ba226001"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.735394 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" event={"ID":"8c66ad3b-7962-4747-97d4-a2c183d25ebc","Type":"ContainerStarted","Data":"a4c6a126f7d357246893a62896313bad9944f16124d9a285ae965c7982561bb7"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.736169 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" event={"ID":"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8","Type":"ContainerStarted","Data":"6d06278325d57bc7c2a12158806ea4014997f6d0d415310d79ebba6f5cc5089f"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.737322 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" event={"ID":"be44c9df-65d4-4a6a-8646-7687f601f6b6","Type":"ContainerStarted","Data":"a751a11e7b158560bc64937bcdeae62304bb4aa1d1a58aa5d2e82ab95ca64a97"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.738346 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" event={"ID":"111bbcd0-554c-4705-a874-1d3aa399a391","Type":"ContainerStarted","Data":"81a8ffa497026a6db96948478c3b9d50b1b1cefb15f4f9855e3de9a3e6825877"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.741636 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" event={"ID":"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad","Type":"ContainerStarted","Data":"5939935893c5dc80bbe6e93ee35a6eeb8a2be45a1c10cc8866cbe81b9e564a48"} Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.772983 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5"] Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.779084 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ff8gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-664664cb68-9q6b5_openstack-operators(0da053a2-94d4-41de-87a8-d7f2662d9b5b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.820329 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.821055 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2224b9_8bd4_4967_9e24_59ce223b2e0e.slice/crio-f6db72f16e5cfcd905b28d19ffad4297c6661fa49ebb4e5f72a58764c136f509 WatchSource:0}: Error finding container f6db72f16e5cfcd905b28d19ffad4297c6661fa49ebb4e5f72a58764c136f509: Status 404 returned error can't find the container with id f6db72f16e5cfcd905b28d19ffad4297c6661fa49ebb4e5f72a58764c136f509 Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.823023 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" 
podUID="b923e24a-92ce-4c4a-8c26-d4fe2b1563ad" Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.825628 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ljrkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-ffcdd6c94-2vmqc_openstack-operators(7c2224b9-8bd4-4967-9e24-59ce223b2e0e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.832930 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.837297 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9544ed2b_6308_4681_a120_d134ee029ded.slice/crio-bce6354274f259143c2533085f54a0e1bc2c7895c10466cc630491510e3803ce WatchSource:0}: Error finding container bce6354274f259143c2533085f54a0e1bc2c7895c10466cc630491510e3803ce: Status 404 returned error can't find the container with id bce6354274f259143c2533085f54a0e1bc2c7895c10466cc630491510e3803ce Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.858692 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c2kfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f4d5dfdc6-b5g5r_openstack-operators(9544ed2b-6308-4681-a120-d134ee029ded): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.942067 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" podUID="0da053a2-94d4-41de-87a8-d7f2662d9b5b" Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.953348 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.961869 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc"] Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.967929 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd"] Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.968431 4822 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5178ccba_ae40_49f5_9fba_6df6b0fbb562.slice/crio-ad9716bed773769aad6d8a3dd8f872179a61af737bdc65d61431e15b244741c5 WatchSource:0}: Error finding container ad9716bed773769aad6d8a3dd8f872179a61af737bdc65d61431e15b244741c5: Status 404 returned error can't find the container with id ad9716bed773769aad6d8a3dd8f872179a61af737bdc65d61431e15b244741c5 Oct 10 06:40:41 crc kubenswrapper[4822]: I1010 06:40:41.975461 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-hh26l"] Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.977050 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z5tx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-578874c84d-2cvmc_openstack-operators(d01da9fa-a63b-4496-bea1-37048e323618): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:41 crc kubenswrapper[4822]: W1010 06:40:41.979709 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57520796_d080_466b_9070_c4cd032ed8ab.slice/crio-1b83e4cdbdf92803ec5c975ce72775017ea1ba3cadc04d23b3a0b46862a40dd7 WatchSource:0}: Error finding container 1b83e4cdbdf92803ec5c975ce72775017ea1ba3cadc04d23b3a0b46862a40dd7: Status 404 returned error can't find the container with id 1b83e4cdbdf92803ec5c975ce72775017ea1ba3cadc04d23b3a0b46862a40dd7 Oct 10 06:40:41 crc kubenswrapper[4822]: E1010 06:40:41.981542 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-646675d848-hh26l_openstack-operators(57520796-d080-466b-9070-c4cd032ed8ab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.008936 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" podUID="9544ed2b-6308-4681-a120-d134ee029ded" Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.009856 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" podUID="7c2224b9-8bd4-4967-9e24-59ce223b2e0e" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.086941 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8"] Oct 10 06:40:42 crc kubenswrapper[4822]: W1010 06:40:42.120670 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb617d013_1412_4783_b71f_f3142cf15c35.slice/crio-e2fba43aa34379853ec08963e741808d34caa9dd676ac6d0b7709f0ea34fcef1 WatchSource:0}: Error finding container e2fba43aa34379853ec08963e741808d34caa9dd676ac6d0b7709f0ea34fcef1: Status 404 returned error can't find the container with id e2fba43aa34379853ec08963e741808d34caa9dd676ac6d0b7709f0ea34fcef1 Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.224811 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.230687 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04dda440-ebd4-412b-9a18-655a9721229d-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74\" (UID: \"04dda440-ebd4-412b-9a18-655a9721229d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.287101 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" podUID="d01da9fa-a63b-4496-bea1-37048e323618" Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.297159 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" podUID="57520796-d080-466b-9070-c4cd032ed8ab" Oct 10 06:40:42 crc 
kubenswrapper[4822]: I1010 06:40:42.392031 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.767338 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" event={"ID":"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50","Type":"ContainerStarted","Data":"a12e0c819227da91531a1da9c9b75fb455797d95cf87d8dafd36ed15fbf7f5cb"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.774524 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" event={"ID":"0da053a2-94d4-41de-87a8-d7f2662d9b5b","Type":"ContainerStarted","Data":"cc63dc9ffe4a062470ee265a83a176340a8c592e67349a55b17808f4a5f9a117"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.774589 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" event={"ID":"0da053a2-94d4-41de-87a8-d7f2662d9b5b","Type":"ContainerStarted","Data":"1ac950f20ac5c5b38643648fb664d6b250a12930c6f07c36b19e453e50ebf2a9"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.780787 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" podUID="0da053a2-94d4-41de-87a8-d7f2662d9b5b" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.797880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" 
event={"ID":"b617d013-1412-4783-b71f-f3142cf15c35","Type":"ContainerStarted","Data":"e2fba43aa34379853ec08963e741808d34caa9dd676ac6d0b7709f0ea34fcef1"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.812378 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" event={"ID":"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad","Type":"ContainerStarted","Data":"37ed4e5507a35018a8e559ffa8161ba5f87eceb796a86d151aa2595889a2759a"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.813781 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" podUID="b923e24a-92ce-4c4a-8c26-d4fe2b1563ad" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.819059 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" event={"ID":"7c2224b9-8bd4-4967-9e24-59ce223b2e0e","Type":"ContainerStarted","Data":"a46ba009c8f687c26846e137f3c1ed80c68a51381fa2789d3af8e8ef98d92fb0"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.819114 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" event={"ID":"7c2224b9-8bd4-4967-9e24-59ce223b2e0e","Type":"ContainerStarted","Data":"f6db72f16e5cfcd905b28d19ffad4297c6661fa49ebb4e5f72a58764c136f509"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.832304 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" podUID="7c2224b9-8bd4-4967-9e24-59ce223b2e0e" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.838963 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" event={"ID":"d01da9fa-a63b-4496-bea1-37048e323618","Type":"ContainerStarted","Data":"b12a120fe3a7b56d75bc68cc53c9ad1096d8f37467f4b923fe295f6fda586afb"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.839002 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" event={"ID":"d01da9fa-a63b-4496-bea1-37048e323618","Type":"ContainerStarted","Data":"4a74fd5748cca5b63df5036e0b305c32d6de45ee6258d18e5d36a629b0065de2"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.840512 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" podUID="d01da9fa-a63b-4496-bea1-37048e323618" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.846558 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" event={"ID":"5178ccba-ae40-49f5-9fba-6df6b0fbb562","Type":"ContainerStarted","Data":"be71402fd660cfeca9bcd29998f66179f5be5ad35a1fb08610abbef2c6245486"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.846596 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" 
event={"ID":"5178ccba-ae40-49f5-9fba-6df6b0fbb562","Type":"ContainerStarted","Data":"27188d551fa8ea9fbab2cf6781f82720e6514960d8c60d266d7a0d19a5951a81"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.846608 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" event={"ID":"5178ccba-ae40-49f5-9fba-6df6b0fbb562","Type":"ContainerStarted","Data":"ad9716bed773769aad6d8a3dd8f872179a61af737bdc65d61431e15b244741c5"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.846713 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.848394 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" event={"ID":"9544ed2b-6308-4681-a120-d134ee029ded","Type":"ContainerStarted","Data":"08ab38b36f7ad9b90e236c98483ca2e7f8e6da0ded74b4b8748d4614143e7642"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.848427 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" event={"ID":"9544ed2b-6308-4681-a120-d134ee029ded","Type":"ContainerStarted","Data":"bce6354274f259143c2533085f54a0e1bc2c7895c10466cc630491510e3803ce"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.849482 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" podUID="9544ed2b-6308-4681-a120-d134ee029ded" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.849914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" event={"ID":"57520796-d080-466b-9070-c4cd032ed8ab","Type":"ContainerStarted","Data":"a4db196cd6b1160a90aca5c3efb014741a82f9044939a7cb33092a7c1901bb69"} Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.849940 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" event={"ID":"57520796-d080-466b-9070-c4cd032ed8ab","Type":"ContainerStarted","Data":"1b83e4cdbdf92803ec5c975ce72775017ea1ba3cadc04d23b3a0b46862a40dd7"} Oct 10 06:40:42 crc kubenswrapper[4822]: E1010 06:40:42.850998 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" podUID="57520796-d080-466b-9070-c4cd032ed8ab" Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.851703 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74"] Oct 10 06:40:42 crc kubenswrapper[4822]: W1010 06:40:42.868344 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04dda440_ebd4_412b_9a18_655a9721229d.slice/crio-0caeff8a087d53fdd27dd683bf45ed260e219d48e64f576e222c41e6e70d49b6 WatchSource:0}: Error finding container 0caeff8a087d53fdd27dd683bf45ed260e219d48e64f576e222c41e6e70d49b6: Status 404 returned error can't find the container with id 0caeff8a087d53fdd27dd683bf45ed260e219d48e64f576e222c41e6e70d49b6 Oct 10 06:40:42 crc kubenswrapper[4822]: I1010 06:40:42.910130 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" podStartSLOduration=2.9101080440000002 podStartE2EDuration="2.910108044s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:40:42.908022454 +0000 UTC m=+990.003180660" watchObservedRunningTime="2025-10-10 06:40:42.910108044 +0000 UTC m=+990.005266240" Oct 10 06:40:43 crc kubenswrapper[4822]: I1010 06:40:43.863467 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" event={"ID":"04dda440-ebd4-412b-9a18-655a9721229d","Type":"ContainerStarted","Data":"0caeff8a087d53fdd27dd683bf45ed260e219d48e64f576e222c41e6e70d49b6"} Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.864991 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" podUID="0da053a2-94d4-41de-87a8-d7f2662d9b5b" Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.865191 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" podUID="d01da9fa-a63b-4496-bea1-37048e323618" Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.865247 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" podUID="57520796-d080-466b-9070-c4cd032ed8ab" Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.865997 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" podUID="b923e24a-92ce-4c4a-8c26-d4fe2b1563ad" Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.866060 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" podUID="7c2224b9-8bd4-4967-9e24-59ce223b2e0e" Oct 10 06:40:43 crc kubenswrapper[4822]: E1010 06:40:43.872117 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" podUID="9544ed2b-6308-4681-a120-d134ee029ded" Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.311147 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-58fd854765-9cj6d" Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.918609 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" event={"ID":"111bbcd0-554c-4705-a874-1d3aa399a391","Type":"ContainerStarted","Data":"b6bd89d6fa8a07d4f786e568c9398575359e141e54f122c243fd0206dd8660f7"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.919998 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" event={"ID":"1ff50152-dd82-48ae-bca4-150c0f892185","Type":"ContainerStarted","Data":"58f3bff0431dd42be3631d12757e77e411b1a38ade404467f3f3cb5b5837e390"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.920788 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" event={"ID":"e757a212-d95c-4ffc-ae84-ceca5cc56cc2","Type":"ContainerStarted","Data":"bd13b4f4d4bd5d9813db2b64cd9b30185be212c90be216d62ad9eac57c912c5c"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.921665 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" event={"ID":"8c66ad3b-7962-4747-97d4-a2c183d25ebc","Type":"ContainerStarted","Data":"e11d1db5d4cbdac9b05073cc231960cb07c0729fefa95b0557ed547cf94651cf"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.922465 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" event={"ID":"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8","Type":"ContainerStarted","Data":"d0c009d3a88a8a43553425cdb6953a7b5c7341590492beba0bb6d8ebda1fe243"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.923402 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" event={"ID":"be44c9df-65d4-4a6a-8646-7687f601f6b6","Type":"ContainerStarted","Data":"4ded86781091d0d21150d860421454423ab20cb82ba630450c82caa11ef65fbb"} Oct 10 
06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.940933 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" event={"ID":"04dda440-ebd4-412b-9a18-655a9721229d","Type":"ContainerStarted","Data":"862caec33ea03dc565296cdc8d4ff6241faf13d337df7f7156d9ea60827dd936"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.947110 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" event={"ID":"3e087283-f802-4b9c-9f1f-bbca4e30a892","Type":"ContainerStarted","Data":"5d3279fab8798fb6a6e2fb4f8e26cbd6b7616534b922247bb14547c5e81296be"} Oct 10 06:40:51 crc kubenswrapper[4822]: I1010 06:40:51.953289 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" event={"ID":"6133aeb2-a9e5-4170-a6e1-b562cdb97975","Type":"ContainerStarted","Data":"a1ed65dcaa471ab79173020e5b771f3c2e5a7d8870ebf2df41bf59eecd9ba0f8"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:51.966715 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" event={"ID":"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50","Type":"ContainerStarted","Data":"2899a5ced093ecc2e15f8313eb7475193329d2bb13ec35d5366891592f0bd418"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:51.971834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" event={"ID":"f56e0976-eb7a-4bcf-bde2-016c83567fc6","Type":"ContainerStarted","Data":"d5777895b44078dec66d1bb288bac72b20345b17c4aa5651e2889997bf334f26"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:51.973191 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" 
event={"ID":"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42","Type":"ContainerStarted","Data":"7aeeb361d5919fd9e3eee4ca5ed26609b4cecba3354091c4a57705064026bca3"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:52.027979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" event={"ID":"367aa79b-9342-431b-ade4-a9195844ce4a","Type":"ContainerStarted","Data":"1e09c173a58e56149ea1733ce605446709f355e474f0b8dc09cd7eb0726525cc"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:52.040236 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" event={"ID":"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347","Type":"ContainerStarted","Data":"34336492f66459910faa724f0c5af937e14cd41548e7c10207706e613361042f"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:52.040285 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" event={"ID":"a21a0dcd-c8b2-4ea4-ab7e-edae527ab347","Type":"ContainerStarted","Data":"357f61bf71fd9457db2146fd7d475710f8d25d72599d3fb006f5ec5491a42e47"} Oct 10 06:40:52 crc kubenswrapper[4822]: I1010 06:40:52.041136 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.048403 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" event={"ID":"f56e0976-eb7a-4bcf-bde2-016c83567fc6","Type":"ContainerStarted","Data":"9fc84f202a301165aa9bfe38df8614ad166fa498e1da2f2fe5107ee24dd448b7"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.048551 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:40:53 crc 
kubenswrapper[4822]: I1010 06:40:53.050269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" event={"ID":"b617d013-1412-4783-b71f-f3142cf15c35","Type":"ContainerStarted","Data":"71a43bb9fff8dfcadebac5dbbc200cafdc992a38fc53309007b8d33de9de59c6"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.052365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" event={"ID":"f3f904c2-4da1-46c2-83c6-2ba18d9ccc50","Type":"ContainerStarted","Data":"4555c06b15b01c156de5bb78c4b82108f7f4d9d57d8ff7bb930ad6153ea89b52"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.052491 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.054143 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" event={"ID":"be44c9df-65d4-4a6a-8646-7687f601f6b6","Type":"ContainerStarted","Data":"65f260bfa314bbf0a9936bae4fc9b5083db429947952dfcc392f257229df90a7"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.054295 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.055533 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" event={"ID":"111bbcd0-554c-4705-a874-1d3aa399a391","Type":"ContainerStarted","Data":"dd76a88be7020b338317834493d154284a41ca4ca141bbd3d7230016314f620a"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.055656 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.056991 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" event={"ID":"04dda440-ebd4-412b-9a18-655a9721229d","Type":"ContainerStarted","Data":"a56bcb8a2fcc648f7e8a816eb3de9dac30e34bdfcd44bf2a4e655f47757cda7b"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.057117 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.058533 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" event={"ID":"49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42","Type":"ContainerStarted","Data":"b2b74d368c5342a7ff276ae6776f1c8795e830d6da334219e63c3c311be00ae3"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.058589 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.060462 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" event={"ID":"367aa79b-9342-431b-ade4-a9195844ce4a","Type":"ContainerStarted","Data":"362926e55f0d0eebaeab82543c560933a27a13720e9bff2858ced075acc1bd46"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.060513 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.065015 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" 
event={"ID":"6133aeb2-a9e5-4170-a6e1-b562cdb97975","Type":"ContainerStarted","Data":"95ecd85c3e1d4ccfbc47e7b65b67ca0c0914013d85e1ade7a83d51ba61c18e22"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.065149 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.065896 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" podStartSLOduration=4.141118222 podStartE2EDuration="14.065883595s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.06049672 +0000 UTC m=+988.155654916" lastFinishedPulling="2025-10-10 06:40:50.985262103 +0000 UTC m=+998.080420289" observedRunningTime="2025-10-10 06:40:52.072838765 +0000 UTC m=+999.167996961" watchObservedRunningTime="2025-10-10 06:40:53.065883595 +0000 UTC m=+1000.161041791" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.067404 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" podStartSLOduration=4.589731178 podStartE2EDuration="14.067397709s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.53660984 +0000 UTC m=+988.631768036" lastFinishedPulling="2025-10-10 06:40:51.014276371 +0000 UTC m=+998.109434567" observedRunningTime="2025-10-10 06:40:53.06157161 +0000 UTC m=+1000.156729806" watchObservedRunningTime="2025-10-10 06:40:53.067397709 +0000 UTC m=+1000.162555905" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.071150 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" 
event={"ID":"e757a212-d95c-4ffc-ae84-ceca5cc56cc2","Type":"ContainerStarted","Data":"9a7d2f00f732086013e570a849c79a7894e18d11e179a0c31b225f7f8014986e"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.071273 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.072983 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" event={"ID":"8c66ad3b-7962-4747-97d4-a2c183d25ebc","Type":"ContainerStarted","Data":"df84d8d9efdd54c69425d7591a1b26579c2b58c24cbc6f24ca79e6e1e399bedd"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.074071 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.075746 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" event={"ID":"1ff50152-dd82-48ae-bca4-150c0f892185","Type":"ContainerStarted","Data":"31399622eab625ecf9953f9bcafbd78742e8b9db25a56de872a4cc6b61e877de"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.076230 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.078550 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" event={"ID":"8d454f08-e347-4e39-8392-9c5b4a2a8f6b","Type":"ContainerStarted","Data":"19b6a5f391d0e352d8a8817d4397e9353187183c83bcb4ae2db2888a9a4c05aa"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.078611 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" event={"ID":"8d454f08-e347-4e39-8392-9c5b4a2a8f6b","Type":"ContainerStarted","Data":"5427f8e11167d852cf4908547f7581813c171a72b7819fc87b47d5a403c4909f"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.078736 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.080243 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" event={"ID":"3e087283-f802-4b9c-9f1f-bbca4e30a892","Type":"ContainerStarted","Data":"3b4e20105db1cf4d27638ca99db0db5ca96f28ebeaa6b116790f91d44a2da7b3"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.080957 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.083871 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" event={"ID":"9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8","Type":"ContainerStarted","Data":"b84212966df2b7a29a97dd77d2a08571d6e76644ac6f564cd8a8f9f92ba2565d"} Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.083977 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.100180 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" podStartSLOduration=5.015197534 podStartE2EDuration="13.100162046s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:42.896672936 +0000 UTC m=+989.991831132" 
lastFinishedPulling="2025-10-10 06:40:50.981637428 +0000 UTC m=+998.076795644" observedRunningTime="2025-10-10 06:40:53.095331626 +0000 UTC m=+1000.190489832" watchObservedRunningTime="2025-10-10 06:40:53.100162046 +0000 UTC m=+1000.195320242" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.131511 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" podStartSLOduration=3.6023019 podStartE2EDuration="13.13149123s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.54318548 +0000 UTC m=+988.638343676" lastFinishedPulling="2025-10-10 06:40:51.07237481 +0000 UTC m=+998.167533006" observedRunningTime="2025-10-10 06:40:53.130472351 +0000 UTC m=+1000.225630567" watchObservedRunningTime="2025-10-10 06:40:53.13149123 +0000 UTC m=+1000.226649426" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.159472 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" podStartSLOduration=3.778591804 podStartE2EDuration="13.159454568s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.639896225 +0000 UTC m=+988.735054421" lastFinishedPulling="2025-10-10 06:40:51.020758989 +0000 UTC m=+998.115917185" observedRunningTime="2025-10-10 06:40:53.144574528 +0000 UTC m=+1000.239732724" watchObservedRunningTime="2025-10-10 06:40:53.159454568 +0000 UTC m=+1000.254612764" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.160955 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" podStartSLOduration=4.212195875 podStartE2EDuration="14.160948971s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.060524621 +0000 UTC m=+988.155682817" 
lastFinishedPulling="2025-10-10 06:40:51.009277717 +0000 UTC m=+998.104435913" observedRunningTime="2025-10-10 06:40:53.159336355 +0000 UTC m=+1000.254494561" watchObservedRunningTime="2025-10-10 06:40:53.160948971 +0000 UTC m=+1000.256107157" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.173508 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-glnm8" podStartSLOduration=4.219537939 podStartE2EDuration="13.173491774s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:42.124509931 +0000 UTC m=+989.219668127" lastFinishedPulling="2025-10-10 06:40:51.078463776 +0000 UTC m=+998.173621962" observedRunningTime="2025-10-10 06:40:53.17126827 +0000 UTC m=+1000.266426456" watchObservedRunningTime="2025-10-10 06:40:53.173491774 +0000 UTC m=+1000.268649970" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.199671 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" podStartSLOduration=4.655926869 podStartE2EDuration="14.19965104s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.480979672 +0000 UTC m=+988.576137868" lastFinishedPulling="2025-10-10 06:40:51.024703843 +0000 UTC m=+998.119862039" observedRunningTime="2025-10-10 06:40:53.19583226 +0000 UTC m=+1000.290990486" watchObservedRunningTime="2025-10-10 06:40:53.19965104 +0000 UTC m=+1000.294809236" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.233228 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" podStartSLOduration=5.133642806 podStartE2EDuration="14.23320887s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.976873124 +0000 UTC m=+989.072031320" lastFinishedPulling="2025-10-10 
06:40:51.076439198 +0000 UTC m=+998.171597384" observedRunningTime="2025-10-10 06:40:53.227200546 +0000 UTC m=+1000.322358742" watchObservedRunningTime="2025-10-10 06:40:53.23320887 +0000 UTC m=+1000.328367056" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.251229 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" podStartSLOduration=3.828046414 podStartE2EDuration="13.25121033s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.638023531 +0000 UTC m=+988.733181727" lastFinishedPulling="2025-10-10 06:40:51.061187447 +0000 UTC m=+998.156345643" observedRunningTime="2025-10-10 06:40:53.248880563 +0000 UTC m=+1000.344038769" watchObservedRunningTime="2025-10-10 06:40:53.25121033 +0000 UTC m=+1000.346368526" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.265000 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" podStartSLOduration=4.751361448 podStartE2EDuration="14.264980948s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.506916072 +0000 UTC m=+988.602074268" lastFinishedPulling="2025-10-10 06:40:51.020535572 +0000 UTC m=+998.115693768" observedRunningTime="2025-10-10 06:40:53.262360782 +0000 UTC m=+1000.357518988" watchObservedRunningTime="2025-10-10 06:40:53.264980948 +0000 UTC m=+1000.360139154" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.287612 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" podStartSLOduration=3.921901306 podStartE2EDuration="13.287585951s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.69266124 +0000 UTC m=+988.787819436" lastFinishedPulling="2025-10-10 06:40:51.058345875 +0000 
UTC m=+998.153504081" observedRunningTime="2025-10-10 06:40:53.283988167 +0000 UTC m=+1000.379146393" watchObservedRunningTime="2025-10-10 06:40:53.287585951 +0000 UTC m=+1000.382744157" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.311871 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" podStartSLOduration=3.80506364 podStartE2EDuration="13.311848853s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.53937952 +0000 UTC m=+988.634537716" lastFinishedPulling="2025-10-10 06:40:51.046164743 +0000 UTC m=+998.141322929" observedRunningTime="2025-10-10 06:40:53.308699012 +0000 UTC m=+1000.403857218" watchObservedRunningTime="2025-10-10 06:40:53.311848853 +0000 UTC m=+1000.407007049" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.329453 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" podStartSLOduration=4.576991459 podStartE2EDuration="14.329435461s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.265982629 +0000 UTC m=+988.361140825" lastFinishedPulling="2025-10-10 06:40:51.018426631 +0000 UTC m=+998.113584827" observedRunningTime="2025-10-10 06:40:53.32560078 +0000 UTC m=+1000.420758976" watchObservedRunningTime="2025-10-10 06:40:53.329435461 +0000 UTC m=+1000.424593657" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.345543 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" podStartSLOduration=3.934976044 podStartE2EDuration="13.345521676s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.64734209 +0000 UTC m=+988.742500276" lastFinishedPulling="2025-10-10 06:40:51.057887712 +0000 UTC m=+998.153045908" 
observedRunningTime="2025-10-10 06:40:53.34429225 +0000 UTC m=+1000.439450446" watchObservedRunningTime="2025-10-10 06:40:53.345521676 +0000 UTC m=+1000.440679872" Oct 10 06:40:53 crc kubenswrapper[4822]: I1010 06:40:53.365136 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" podStartSLOduration=4.551232813 podStartE2EDuration="14.365112062s" podCreationTimestamp="2025-10-10 06:40:39 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.250301875 +0000 UTC m=+988.345460071" lastFinishedPulling="2025-10-10 06:40:51.064181124 +0000 UTC m=+998.159339320" observedRunningTime="2025-10-10 06:40:53.359793988 +0000 UTC m=+1000.454952194" watchObservedRunningTime="2025-10-10 06:40:53.365112062 +0000 UTC m=+1000.460270258" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.122948 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" event={"ID":"9544ed2b-6308-4681-a120-d134ee029ded","Type":"ContainerStarted","Data":"b0713fcf2593bbfdb87895e328013b06d96fad29ee3cb2c49c2592d90cc7bf8f"} Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.123673 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.124601 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" event={"ID":"0da053a2-94d4-41de-87a8-d7f2662d9b5b","Type":"ContainerStarted","Data":"084f0cfe5e33af4b5d8f1366e93dfdd6bb8c73363b370f0fbe42dc5b9ddcb07d"} Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.124970 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 
06:40:58.126680 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" event={"ID":"b923e24a-92ce-4c4a-8c26-d4fe2b1563ad","Type":"ContainerStarted","Data":"0eedd1369cf91b41ca05b793457a20a3f55fc0adc5661ff32c04d58164c860c3"} Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.126844 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.143824 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" podStartSLOduration=2.116278554 podStartE2EDuration="18.143787419s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.858524614 +0000 UTC m=+988.953682810" lastFinishedPulling="2025-10-10 06:40:57.886033489 +0000 UTC m=+1004.981191675" observedRunningTime="2025-10-10 06:40:58.142532442 +0000 UTC m=+1005.237690658" watchObservedRunningTime="2025-10-10 06:40:58.143787419 +0000 UTC m=+1005.238945615" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.159827 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" podStartSLOduration=1.9251882 podStartE2EDuration="18.159789881s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.651278564 +0000 UTC m=+988.746436760" lastFinishedPulling="2025-10-10 06:40:57.885880225 +0000 UTC m=+1004.981038441" observedRunningTime="2025-10-10 06:40:58.157446133 +0000 UTC m=+1005.252604329" watchObservedRunningTime="2025-10-10 06:40:58.159789881 +0000 UTC m=+1005.254948077" Oct 10 06:40:58 crc kubenswrapper[4822]: I1010 06:40:58.178833 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" podStartSLOduration=2.065966198 podStartE2EDuration="18.178817541s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.778968134 +0000 UTC m=+988.874126330" lastFinishedPulling="2025-10-10 06:40:57.891819477 +0000 UTC m=+1004.986977673" observedRunningTime="2025-10-10 06:40:58.173108446 +0000 UTC m=+1005.268266662" watchObservedRunningTime="2025-10-10 06:40:58.178817541 +0000 UTC m=+1005.273975737" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.148442 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" event={"ID":"d01da9fa-a63b-4496-bea1-37048e323618","Type":"ContainerStarted","Data":"76985cf5ec0e2d062c04a312e534baea0c7dc0ddd08f0352029a87eeef40057a"} Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.149042 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.173774 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" podStartSLOduration=2.630191905 podStartE2EDuration="20.173747836s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.976882824 +0000 UTC m=+989.072041020" lastFinishedPulling="2025-10-10 06:40:59.520438715 +0000 UTC m=+1006.615596951" observedRunningTime="2025-10-10 06:41:00.169006789 +0000 UTC m=+1007.264165015" watchObservedRunningTime="2025-10-10 06:41:00.173747836 +0000 UTC m=+1007.268906032" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.192953 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-wpwlg" Oct 10 06:41:00 crc 
kubenswrapper[4822]: I1010 06:41:00.199993 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-2h5rm" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.218874 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-j9ksb" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.255000 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-76qx8" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.285907 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-grq55" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.371368 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-gmgqr" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.567709 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-tbj65" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.596512 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-4h8gw" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.620573 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-2fwqj" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.653574 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-dt47l" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.679157 4822 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-rvhj6" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.746256 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-k8fww" Oct 10 06:41:00 crc kubenswrapper[4822]: I1010 06:41:00.774819 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-7d77l" Oct 10 06:41:01 crc kubenswrapper[4822]: I1010 06:41:01.084301 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rtgvd" Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.165461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" event={"ID":"57520796-d080-466b-9070-c4cd032ed8ab","Type":"ContainerStarted","Data":"1deda13460ca9a3eac655f081c66739ed1f7702d604d0152bf3c528d1b99c7b7"} Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.166445 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.168005 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" event={"ID":"7c2224b9-8bd4-4967-9e24-59ce223b2e0e","Type":"ContainerStarted","Data":"6571980315d17878e754c90614d2d73db9498e5569648278f8437b48f560720a"} Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.168270 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.188220 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" podStartSLOduration=2.550102321 podStartE2EDuration="22.188204765s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.981441366 +0000 UTC m=+989.076599562" lastFinishedPulling="2025-10-10 06:41:01.61954379 +0000 UTC m=+1008.714702006" observedRunningTime="2025-10-10 06:41:02.186106205 +0000 UTC m=+1009.281264411" watchObservedRunningTime="2025-10-10 06:41:02.188204765 +0000 UTC m=+1009.283362961" Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.214390 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" podStartSLOduration=2.420767983 podStartE2EDuration="22.214370582s" podCreationTimestamp="2025-10-10 06:40:40 +0000 UTC" firstStartedPulling="2025-10-10 06:40:41.825473478 +0000 UTC m=+988.920631684" lastFinishedPulling="2025-10-10 06:41:01.619076087 +0000 UTC m=+1008.714234283" observedRunningTime="2025-10-10 06:41:02.21155681 +0000 UTC m=+1009.306715016" watchObservedRunningTime="2025-10-10 06:41:02.214370582 +0000 UTC m=+1009.309528778" Oct 10 06:41:02 crc kubenswrapper[4822]: I1010 06:41:02.397232 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74" Oct 10 06:41:10 crc kubenswrapper[4822]: I1010 06:41:10.737247 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-tb22w" Oct 10 06:41:11 crc kubenswrapper[4822]: I1010 06:41:11.002027 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-9q6b5" Oct 10 06:41:11 crc kubenswrapper[4822]: I1010 06:41:11.068386 4822 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-b5g5r" Oct 10 06:41:11 crc kubenswrapper[4822]: I1010 06:41:11.132898 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2cvmc" Oct 10 06:41:11 crc kubenswrapper[4822]: I1010 06:41:11.251070 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-2vmqc" Oct 10 06:41:11 crc kubenswrapper[4822]: I1010 06:41:11.320642 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-hh26l" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.590291 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.592342 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.593832 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2cbhh" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.594192 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.594433 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.594587 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.632709 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.632781 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrw9b\" (UniqueName: \"kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.648436 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.682757 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.684343 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.686435 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.698576 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.733836 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.733888 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.733929 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrw9b\" (UniqueName: \"kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.733979 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk6pl\" (UniqueName: \"kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc 
kubenswrapper[4822]: I1010 06:41:24.734034 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.734763 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.761472 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrw9b\" (UniqueName: \"kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b\") pod \"dnsmasq-dns-675f4bcbfc-8pjkd\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.835062 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.835137 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk6pl\" (UniqueName: \"kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.835170 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.835937 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.836417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.861526 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk6pl\" (UniqueName: \"kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl\") pod \"dnsmasq-dns-78dd6ddcc-zd66g\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:24 crc kubenswrapper[4822]: I1010 06:41:24.911917 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:25 crc kubenswrapper[4822]: I1010 06:41:24.999844 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:25 crc kubenswrapper[4822]: I1010 06:41:25.444004 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:25 crc kubenswrapper[4822]: W1010 06:41:25.451151 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod503600d6_3ea7_4309_a1a1_9d412b78dd6a.slice/crio-fc502cde41cf2513cf9950c47b47065a45736b0dead9cb967898079c3960c0ab WatchSource:0}: Error finding container fc502cde41cf2513cf9950c47b47065a45736b0dead9cb967898079c3960c0ab: Status 404 returned error can't find the container with id fc502cde41cf2513cf9950c47b47065a45736b0dead9cb967898079c3960c0ab Oct 10 06:41:25 crc kubenswrapper[4822]: I1010 06:41:25.454087 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 06:41:25 crc kubenswrapper[4822]: I1010 06:41:25.521374 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:25 crc kubenswrapper[4822]: W1010 06:41:25.522360 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd282c7c5_dac8_407d_a65c_7b3532cbbaff.slice/crio-ced9746914569423bc19c9ef346803f8626b53a0b9cd01354057d3df54139c2c WatchSource:0}: Error finding container ced9746914569423bc19c9ef346803f8626b53a0b9cd01354057d3df54139c2c: Status 404 returned error can't find the container with id ced9746914569423bc19c9ef346803f8626b53a0b9cd01354057d3df54139c2c Oct 10 06:41:26 crc kubenswrapper[4822]: I1010 06:41:26.354742 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" event={"ID":"503600d6-3ea7-4309-a1a1-9d412b78dd6a","Type":"ContainerStarted","Data":"fc502cde41cf2513cf9950c47b47065a45736b0dead9cb967898079c3960c0ab"} Oct 10 06:41:26 crc kubenswrapper[4822]: 
I1010 06:41:26.359465 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" event={"ID":"d282c7c5-dac8-407d-a65c-7b3532cbbaff","Type":"ContainerStarted","Data":"ced9746914569423bc19c9ef346803f8626b53a0b9cd01354057d3df54139c2c"} Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.582889 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.598656 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.600181 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.617021 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.681695 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.681758 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.681782 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lwv\" (UniqueName: 
\"kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.805880 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.805986 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.806012 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lwv\" (UniqueName: \"kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.806883 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.806985 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc\") pod 
\"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.831830 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lwv\" (UniqueName: \"kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv\") pod \"dnsmasq-dns-5ccc8479f9-9bvjs\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.891635 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.923662 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"] Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.924978 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.925836 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:27 crc kubenswrapper[4822]: I1010 06:41:27.938656 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"] Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.011426 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.011488 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.011594 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd498\" (UniqueName: \"kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.115584 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd498\" (UniqueName: \"kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.116142 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.116178 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.117180 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.118933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.143956 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd498\" (UniqueName: \"kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498\") pod \"dnsmasq-dns-57d769cc4f-96q8h\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") " pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.255751 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.544630 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.735690 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.739453 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.744549 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.744610 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.744751 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.744790 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.744923 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.745045 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9bb2b" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.745073 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.749303 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 
06:41:28.799561 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"] Oct 10 06:41:28 crc kubenswrapper[4822]: W1010 06:41:28.812265 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f6eb39a_30d8_4484_b987_25fa4582ab89.slice/crio-b2992015c5eb4cf036615aedca7aae7069637cbac09b93eeaa1350ff36e35de3 WatchSource:0}: Error finding container b2992015c5eb4cf036615aedca7aae7069637cbac09b93eeaa1350ff36e35de3: Status 404 returned error can't find the container with id b2992015c5eb4cf036615aedca7aae7069637cbac09b93eeaa1350ff36e35de3 Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.830869 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxh6\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.830941 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.830992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831021 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831073 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831111 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831153 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831181 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831240 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831310 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.831345 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933454 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933559 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxh6\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933612 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933676 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933703 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933725 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933847 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933866 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933919 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.933967 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.935216 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.935633 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.936270 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.936593 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.936946 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.937515 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.942036 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 
06:41:28.944721 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.946939 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.949719 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.953161 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxh6\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:28 crc kubenswrapper[4822]: I1010 06:41:28.981967 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.045967 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 
06:41:29.047621 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051094 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051441 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v25fz" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051464 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051490 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051617 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051712 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.051826 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.054014 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.083594 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.136714 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.136770 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.136904 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tc4\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.136977 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137037 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " 
pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137070 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137114 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137186 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137244 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.137462 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.241609 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.241680 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.241719 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.241738 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.242323 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tc4\" (UniqueName: 
\"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.242654 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.242922 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.242988 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243021 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243068 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " 
pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243129 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243174 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243263 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.243746 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.244177 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.244452 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.244636 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.248770 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.249533 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.249562 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.256277 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " 
pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.268202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.269442 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tc4\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4\") pod \"rabbitmq-server-0\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") " pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.375719 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.382610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" event={"ID":"9f6eb39a-30d8-4484-b987-25fa4582ab89","Type":"ContainerStarted","Data":"b2992015c5eb4cf036615aedca7aae7069637cbac09b93eeaa1350ff36e35de3"} Oct 10 06:41:29 crc kubenswrapper[4822]: I1010 06:41:29.385071 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" event={"ID":"e3ff5738-5692-449f-a488-7fd2d202590d","Type":"ContainerStarted","Data":"34fb55ee1ffc2847c25481e5f21d821232edfe9e6a508a73d5830c8edc645121"} Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.243383 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.244675 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.247958 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.248224 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nspr7" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.249666 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.249677 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.250366 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256473 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256528 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256571 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256600 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256636 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256661 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7s5\" (UniqueName: \"kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256690 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256747 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " 
pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.256796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.261620 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.263854 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358033 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358091 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358154 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358187 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358210 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7s5\" (UniqueName: \"kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358234 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358272 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.358314 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.359482 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.359489 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.359549 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.359919 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.361532 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.363506 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.363972 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.380746 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.397015 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7s5\" (UniqueName: \"kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.398683 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") " pod="openstack/openstack-galera-0" Oct 10 06:41:30 crc kubenswrapper[4822]: I1010 06:41:30.565255 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.612941 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.614496 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.625485 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mqkzb" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.626079 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.626498 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.631367 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.634664 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.771102 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.772095 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.775575 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.775649 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-622tp" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.775768 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779612 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drg2\" (UniqueName: \"kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779678 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779722 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779770 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779821 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779864 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779903 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779921 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.779953 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.789292 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881670 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvh9\" (UniqueName: \"kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881738 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881780 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881868 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881897 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881928 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881963 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.881999 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882027 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5drg2\" (UniqueName: 
\"kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882062 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882090 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882129 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882161 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.882625 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.883315 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.883368 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.883514 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.885473 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.886566 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets\") pod 
\"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.886620 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.887909 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.905156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.908529 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drg2\" (UniqueName: \"kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2\") pod \"openstack-cell1-galera-0\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.939221 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.983905 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.983957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.984004 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.984055 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.984090 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvh9\" (UniqueName: \"kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.984963 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.985070 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.989248 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.989836 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:31 crc kubenswrapper[4822]: I1010 06:41:31.998608 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvh9\" (UniqueName: \"kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9\") pod \"memcached-0\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") " pod="openstack/memcached-0" Oct 10 06:41:32 crc kubenswrapper[4822]: I1010 06:41:32.092200 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 06:41:33 crc kubenswrapper[4822]: I1010 06:41:33.852659 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:41:33 crc kubenswrapper[4822]: I1010 06:41:33.854281 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:41:33 crc kubenswrapper[4822]: I1010 06:41:33.862340 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hk2fs" Oct 10 06:41:33 crc kubenswrapper[4822]: I1010 06:41:33.865882 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:41:34 crc kubenswrapper[4822]: I1010 06:41:34.015366 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5tn\" (UniqueName: \"kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn\") pod \"kube-state-metrics-0\" (UID: \"ca7c7319-b717-4031-b211-5dfa1c501003\") " pod="openstack/kube-state-metrics-0" Oct 10 06:41:34 crc kubenswrapper[4822]: I1010 06:41:34.116276 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5tn\" (UniqueName: \"kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn\") pod \"kube-state-metrics-0\" (UID: \"ca7c7319-b717-4031-b211-5dfa1c501003\") " pod="openstack/kube-state-metrics-0" Oct 10 06:41:34 crc kubenswrapper[4822]: I1010 06:41:34.135893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5tn\" (UniqueName: \"kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn\") pod \"kube-state-metrics-0\" (UID: \"ca7c7319-b717-4031-b211-5dfa1c501003\") " pod="openstack/kube-state-metrics-0" Oct 10 06:41:34 crc kubenswrapper[4822]: I1010 06:41:34.196201 4822 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.561513 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jdbx"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.564253 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.567320 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.567389 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fc7dw" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.567565 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.585618 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.587784 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.602500 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jdbx"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.616610 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.674768 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.674824 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.674942 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675001 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc 
kubenswrapper[4822]: I1010 06:41:37.675029 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675081 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675134 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675264 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxx94\" (UniqueName: \"kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 
06:41:37.675318 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675343 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.675525 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tk9\" (UniqueName: \"kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777401 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777446 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-h2tk9\" (UniqueName: \"kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777490 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777509 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777529 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777549 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777572 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib\") pod 
\"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777594 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777653 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777672 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777740 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxx94\" (UniqueName: \"kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.777761 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 
crc kubenswrapper[4822]: I1010 06:41:37.777777 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.778302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.782461 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.785828 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.785953 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.788289 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.788407 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.788683 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.788741 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.788787 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.789484 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" 
Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.794962 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.799252 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tk9\" (UniqueName: \"kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9\") pod \"ovn-controller-5jdbx\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.806669 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.808285 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.816510 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxx94\" (UniqueName: \"kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94\") pod \"ovn-controller-ovs-nrgt7\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.816762 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.817065 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.817214 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ldfh2" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.817406 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.817969 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.822221 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.892568 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.913104 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.988136 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.988495 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.988655 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bq6b\" (UniqueName: \"kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.988834 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.989071 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " 
pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.989141 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.989181 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:37 crc kubenswrapper[4822]: I1010 06:41:37.989229 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.090890 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.090950 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.090996 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091013 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091035 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bq6b\" (UniqueName: \"kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091078 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091114 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091139 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091467 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091839 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.091975 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.092367 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.095873 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.105479 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.105713 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.108608 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bq6b\" (UniqueName: \"kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.114789 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:38 crc kubenswrapper[4822]: I1010 06:41:38.169275 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.452506 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.455274 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.457630 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-96s4q" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.457958 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.458177 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.458366 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.486079 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560116 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560204 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560231 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " 
pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560256 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560290 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560332 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgdz\" (UniqueName: \"kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560354 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.560405 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 
06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662086 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662151 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662169 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662189 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662215 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662251 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgdz\" (UniqueName: 
\"kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662270 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.662304 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.663976 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.664207 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.664210 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " 
pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.664509 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.668205 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.668577 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.670309 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.691384 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.698359 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgdz\" (UniqueName: 
\"kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz\") pod \"ovsdbserver-sb-0\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:40 crc kubenswrapper[4822]: I1010 06:41:40.784293 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.635275 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.635657 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sk6pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zd66g_openstack(d282c7c5-dac8-407d-a65c-7b3532cbbaff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.636883 4822 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" podUID="d282c7c5-dac8-407d-a65c-7b3532cbbaff" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.722224 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.722383 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrw9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8pjkd_openstack(503600d6-3ea7-4309-a1a1-9d412b78dd6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 06:41:41 crc kubenswrapper[4822]: E1010 06:41:41.724206 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" podUID="503600d6-3ea7-4309-a1a1-9d412b78dd6a" Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.282459 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.517475 4822 generic.go:334] "Generic (PLEG): container finished" podID="e3ff5738-5692-449f-a488-7fd2d202590d" containerID="4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb" exitCode=0 Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.517560 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" event={"ID":"e3ff5738-5692-449f-a488-7fd2d202590d","Type":"ContainerDied","Data":"4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb"} Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.520848 4822 generic.go:334] "Generic (PLEG): container finished" podID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerID="2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c" exitCode=0 Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.520947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" event={"ID":"9f6eb39a-30d8-4484-b987-25fa4582ab89","Type":"ContainerDied","Data":"2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c"} Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.522311 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f","Type":"ContainerStarted","Data":"29ea42e8823ceb5ead644f658fee9bfd0f2ceee1d79139c6f2ba13e3b3147a27"} Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.610717 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: W1010 06:41:42.629832 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe2e09c_1139_449c_919b_206fbe0614ab.slice/crio-10897d820ebbf299d03df7a8376ecfb2b3e358c5a11ed6b377f9b97862f6ed5c WatchSource:0}: Error finding container 10897d820ebbf299d03df7a8376ecfb2b3e358c5a11ed6b377f9b97862f6ed5c: Status 404 returned error can't find the container with id 10897d820ebbf299d03df7a8376ecfb2b3e358c5a11ed6b377f9b97862f6ed5c Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.637035 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.662169 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.676497 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.683431 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.687653 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jdbx"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.786115 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 06:41:42 crc kubenswrapper[4822]: E1010 06:41:42.807854 4822 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 10 06:41:42 crc kubenswrapper[4822]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e3ff5738-5692-449f-a488-7fd2d202590d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 10 06:41:42 crc kubenswrapper[4822]: > podSandboxID="34fb55ee1ffc2847c25481e5f21d821232edfe9e6a508a73d5830c8edc645121" Oct 10 06:41:42 crc kubenswrapper[4822]: E1010 06:41:42.807999 4822 kuberuntime_manager.go:1274] "Unhandled Error" err=< 
Oct 10 06:41:42 crc kubenswrapper[4822]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8lwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-9bvjs_openstack(e3ff5738-5692-449f-a488-7fd2d202590d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e3ff5738-5692-449f-a488-7fd2d202590d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 10 06:41:42 crc kubenswrapper[4822]: > logger="UnhandledError" Oct 10 06:41:42 crc kubenswrapper[4822]: E1010 06:41:42.809068 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e3ff5738-5692-449f-a488-7fd2d202590d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.840965 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"] Oct 10 06:41:42 crc kubenswrapper[4822]: I1010 06:41:42.994991 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.002720 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.014382 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrw9b\" (UniqueName: \"kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b\") pod \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.014421 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config\") pod \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\" (UID: \"503600d6-3ea7-4309-a1a1-9d412b78dd6a\") " Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.014456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk6pl\" (UniqueName: \"kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl\") pod \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.014494 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc\") pod \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\" (UID: \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.014587 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config\") pod \"d282c7c5-dac8-407d-a65c-7b3532cbbaff\" (UID: 
\"d282c7c5-dac8-407d-a65c-7b3532cbbaff\") " Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.015198 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config" (OuterVolumeSpecName: "config") pod "503600d6-3ea7-4309-a1a1-9d412b78dd6a" (UID: "503600d6-3ea7-4309-a1a1-9d412b78dd6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.015661 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d282c7c5-dac8-407d-a65c-7b3532cbbaff" (UID: "d282c7c5-dac8-407d-a65c-7b3532cbbaff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.015954 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config" (OuterVolumeSpecName: "config") pod "d282c7c5-dac8-407d-a65c-7b3532cbbaff" (UID: "d282c7c5-dac8-407d-a65c-7b3532cbbaff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.021689 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl" (OuterVolumeSpecName: "kube-api-access-sk6pl") pod "d282c7c5-dac8-407d-a65c-7b3532cbbaff" (UID: "d282c7c5-dac8-407d-a65c-7b3532cbbaff"). InnerVolumeSpecName "kube-api-access-sk6pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.027165 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b" (OuterVolumeSpecName: "kube-api-access-rrw9b") pod "503600d6-3ea7-4309-a1a1-9d412b78dd6a" (UID: "503600d6-3ea7-4309-a1a1-9d412b78dd6a"). InnerVolumeSpecName "kube-api-access-rrw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.115884 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrw9b\" (UniqueName: \"kubernetes.io/projected/503600d6-3ea7-4309-a1a1-9d412b78dd6a-kube-api-access-rrw9b\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.115914 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503600d6-3ea7-4309-a1a1-9d412b78dd6a-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.115924 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk6pl\" (UniqueName: \"kubernetes.io/projected/d282c7c5-dac8-407d-a65c-7b3532cbbaff-kube-api-access-sk6pl\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.115934 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.115943 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d282c7c5-dac8-407d-a65c-7b3532cbbaff-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.349641 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 
06:41:43 crc kubenswrapper[4822]: W1010 06:41:43.393893 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22236db0_c666_44e4_a290_66626e76cdad.slice/crio-f540e77c53df470bad8017634d1f6c2f94da2895ba72b0aba59dbab08662ea7c WatchSource:0}: Error finding container f540e77c53df470bad8017634d1f6c2f94da2895ba72b0aba59dbab08662ea7c: Status 404 returned error can't find the container with id f540e77c53df470bad8017634d1f6c2f94da2895ba72b0aba59dbab08662ea7c Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.532109 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerStarted","Data":"f540e77c53df470bad8017634d1f6c2f94da2895ba72b0aba59dbab08662ea7c"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.539153 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerStarted","Data":"1feeb41b38d8c35a54ee7dcc51b31b38edd8105b60ae5790db2e613cd851ac04"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.541422 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerStarted","Data":"285ad830049d2a7a50d439eb2ade0783f668170a94ba668dbb96ae44b83ddd41"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.543264 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" event={"ID":"503600d6-3ea7-4309-a1a1-9d412b78dd6a","Type":"ContainerDied","Data":"fc502cde41cf2513cf9950c47b47065a45736b0dead9cb967898079c3960c0ab"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.543283 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8pjkd" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.545504 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerStarted","Data":"606e62ba4b6b8392d156769f63c12dbf600698294d4209f882b2b5af761dde6c"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.546823 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerStarted","Data":"10897d820ebbf299d03df7a8376ecfb2b3e358c5a11ed6b377f9b97862f6ed5c"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.549264 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" event={"ID":"9f6eb39a-30d8-4484-b987-25fa4582ab89","Type":"ContainerStarted","Data":"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.549304 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.557666 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.557664 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zd66g" event={"ID":"d282c7c5-dac8-407d-a65c-7b3532cbbaff","Type":"ContainerDied","Data":"ced9746914569423bc19c9ef346803f8626b53a0b9cd01354057d3df54139c2c"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.560570 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx" event={"ID":"27c9c088-64aa-44cd-8e1d-5e007e0d309b","Type":"ContainerStarted","Data":"26207bf121036d4614299d8c3674c766a097beef5ad362bb997783a2cb1eae80"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.562585 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerStarted","Data":"3cb6b0d8598096cfa4a9302c0c53a869fca682262baf741257f4945f73e789e9"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.565511 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca7c7319-b717-4031-b211-5dfa1c501003","Type":"ContainerStarted","Data":"7222b4422ec7269c0efea74e8ff34b68cd70102eb95b90cededb1242f90f5f6c"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.569993 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerStarted","Data":"7abbad523e93f4274c96aa54fd62cf3a2176d5ba999ead6cbca3d1ac1331ffe8"} Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.571294 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" podStartSLOduration=3.64451259 podStartE2EDuration="16.571281269s" podCreationTimestamp="2025-10-10 06:41:27 +0000 UTC" firstStartedPulling="2025-10-10 06:41:28.814490607 +0000 UTC m=+1035.909648803" 
lastFinishedPulling="2025-10-10 06:41:41.741259286 +0000 UTC m=+1048.836417482" observedRunningTime="2025-10-10 06:41:43.565700776 +0000 UTC m=+1050.660858982" watchObservedRunningTime="2025-10-10 06:41:43.571281269 +0000 UTC m=+1050.666439465" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.605720 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.611135 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8pjkd"] Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.645966 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.667086 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503600d6-3ea7-4309-a1a1-9d412b78dd6a" path="/var/lib/kubelet/pods/503600d6-3ea7-4309-a1a1-9d412b78dd6a/volumes" Oct 10 06:41:43 crc kubenswrapper[4822]: I1010 06:41:43.667530 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zd66g"] Oct 10 06:41:45 crc kubenswrapper[4822]: I1010 06:41:45.659481 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d282c7c5-dac8-407d-a65c-7b3532cbbaff" path="/var/lib/kubelet/pods/d282c7c5-dac8-407d-a65c-7b3532cbbaff/volumes" Oct 10 06:41:48 crc kubenswrapper[4822]: I1010 06:41:48.257982 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" Oct 10 06:41:48 crc kubenswrapper[4822]: I1010 06:41:48.325628 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.622842 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f","Type":"ContainerStarted","Data":"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8"} Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.623430 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.628777 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" event={"ID":"e3ff5738-5692-449f-a488-7fd2d202590d","Type":"ContainerStarted","Data":"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385"} Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.628933 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="dnsmasq-dns" containerID="cri-o://4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385" gracePeriod=10 Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.628955 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.653180 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.454763294 podStartE2EDuration="19.653161253s" podCreationTimestamp="2025-10-10 06:41:31 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.287932995 +0000 UTC m=+1049.383091191" lastFinishedPulling="2025-10-10 06:41:49.486330954 +0000 UTC m=+1056.581489150" observedRunningTime="2025-10-10 06:41:50.653097411 +0000 UTC m=+1057.748255627" watchObservedRunningTime="2025-10-10 06:41:50.653161253 +0000 UTC m=+1057.748319449" Oct 10 06:41:50 crc kubenswrapper[4822]: I1010 06:41:50.680506 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" podStartSLOduration=10.497289881 
podStartE2EDuration="23.68049102s" podCreationTimestamp="2025-10-10 06:41:27 +0000 UTC" firstStartedPulling="2025-10-10 06:41:28.556728758 +0000 UTC m=+1035.651886954" lastFinishedPulling="2025-10-10 06:41:41.739929887 +0000 UTC m=+1048.835088093" observedRunningTime="2025-10-10 06:41:50.678594214 +0000 UTC m=+1057.773752410" watchObservedRunningTime="2025-10-10 06:41:50.68049102 +0000 UTC m=+1057.775649216" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.119855 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.266480 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc\") pod \"e3ff5738-5692-449f-a488-7fd2d202590d\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.266538 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config\") pod \"e3ff5738-5692-449f-a488-7fd2d202590d\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.266701 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lwv\" (UniqueName: \"kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv\") pod \"e3ff5738-5692-449f-a488-7fd2d202590d\" (UID: \"e3ff5738-5692-449f-a488-7fd2d202590d\") " Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.274897 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv" (OuterVolumeSpecName: "kube-api-access-w8lwv") pod "e3ff5738-5692-449f-a488-7fd2d202590d" (UID: 
"e3ff5738-5692-449f-a488-7fd2d202590d"). InnerVolumeSpecName "kube-api-access-w8lwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.368153 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lwv\" (UniqueName: \"kubernetes.io/projected/e3ff5738-5692-449f-a488-7fd2d202590d-kube-api-access-w8lwv\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.463660 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config" (OuterVolumeSpecName: "config") pod "e3ff5738-5692-449f-a488-7fd2d202590d" (UID: "e3ff5738-5692-449f-a488-7fd2d202590d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.469255 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.477521 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3ff5738-5692-449f-a488-7fd2d202590d" (UID: "e3ff5738-5692-449f-a488-7fd2d202590d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.570545 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ff5738-5692-449f-a488-7fd2d202590d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.637248 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerStarted","Data":"85d5df3b347c7c673d52d3c41f938ec4fc9dd982fd6b9414a88fc92d459a23ab"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.639249 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx" event={"ID":"27c9c088-64aa-44cd-8e1d-5e007e0d309b","Type":"ContainerStarted","Data":"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.639453 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5jdbx" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.640976 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca7c7319-b717-4031-b211-5dfa1c501003","Type":"ContainerStarted","Data":"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.641094 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.642433 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerStarted","Data":"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.645230 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerStarted","Data":"365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.646657 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerStarted","Data":"d368a9283a474f279f3a9e7eb440163641f3b6b196b241c7c7adf0aa687133b9"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.648269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerStarted","Data":"145590554ea08262793a80e00ae8eeb2daf15d2d75801a13902413f227da8393"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.654788 4822 generic.go:334] "Generic (PLEG): container finished" podID="e3ff5738-5692-449f-a488-7fd2d202590d" containerID="4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385" exitCode=0 Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.655322 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.659755 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" event={"ID":"e3ff5738-5692-449f-a488-7fd2d202590d","Type":"ContainerDied","Data":"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.659834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9bvjs" event={"ID":"e3ff5738-5692-449f-a488-7fd2d202590d","Type":"ContainerDied","Data":"34fb55ee1ffc2847c25481e5f21d821232edfe9e6a508a73d5830c8edc645121"} Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.659857 4822 scope.go:117] "RemoveContainer" containerID="4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.663951 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5jdbx" podStartSLOduration=7.509995782 podStartE2EDuration="14.663933986s" podCreationTimestamp="2025-10-10 06:41:37 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.809592505 +0000 UTC m=+1049.904750701" lastFinishedPulling="2025-10-10 06:41:49.963530709 +0000 UTC m=+1057.058688905" observedRunningTime="2025-10-10 06:41:51.661358051 +0000 UTC m=+1058.756516277" watchObservedRunningTime="2025-10-10 06:41:51.663933986 +0000 UTC m=+1058.759092192" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.685028 4822 scope.go:117] "RemoveContainer" containerID="4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.760456 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.491391733 podStartE2EDuration="18.760433708s" podCreationTimestamp="2025-10-10 06:41:33 +0000 UTC" firstStartedPulling="2025-10-10 
06:41:42.692444422 +0000 UTC m=+1049.787602618" lastFinishedPulling="2025-10-10 06:41:50.961486397 +0000 UTC m=+1058.056644593" observedRunningTime="2025-10-10 06:41:51.75604586 +0000 UTC m=+1058.851204056" watchObservedRunningTime="2025-10-10 06:41:51.760433708 +0000 UTC m=+1058.855591904" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.772921 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.781321 4822 scope.go:117] "RemoveContainer" containerID="4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385" Oct 10 06:41:51 crc kubenswrapper[4822]: E1010 06:41:51.781899 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385\": container with ID starting with 4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385 not found: ID does not exist" containerID="4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.781947 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385"} err="failed to get container status \"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385\": rpc error: code = NotFound desc = could not find container \"4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385\": container with ID starting with 4867c9f753b93fd0f014688388008967566466d928f6c74d1893a2195d43e385 not found: ID does not exist" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.781979 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9bvjs"] Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.781991 4822 scope.go:117] "RemoveContainer" 
containerID="4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb" Oct 10 06:41:51 crc kubenswrapper[4822]: E1010 06:41:51.782558 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb\": container with ID starting with 4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb not found: ID does not exist" containerID="4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb" Oct 10 06:41:51 crc kubenswrapper[4822]: I1010 06:41:51.782593 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb"} err="failed to get container status \"4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb\": rpc error: code = NotFound desc = could not find container \"4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb\": container with ID starting with 4e5ddbe1305efaee79b1179c9e3f7df2337e71f2e44b6fc08a0c0e1fb4e26bfb not found: ID does not exist" Oct 10 06:41:52 crc kubenswrapper[4822]: I1010 06:41:52.667291 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerStarted","Data":"eea57be423b28bbdd8158cba171eb7c5cb45f3b6969aa019c1c3bb74047f15a3"} Oct 10 06:41:52 crc kubenswrapper[4822]: I1010 06:41:52.668843 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerStarted","Data":"32c5a9d6afcaeb38dc9800deb4a9ed3a02f085527940c8caf49522b1f31a6f55"} Oct 10 06:41:52 crc kubenswrapper[4822]: I1010 06:41:52.670795 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerID="074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992" 
exitCode=0 Oct 10 06:41:52 crc kubenswrapper[4822]: I1010 06:41:52.670992 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerDied","Data":"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992"} Oct 10 06:41:53 crc kubenswrapper[4822]: I1010 06:41:53.661023 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" path="/var/lib/kubelet/pods/e3ff5738-5692-449f-a488-7fd2d202590d/volumes" Oct 10 06:41:53 crc kubenswrapper[4822]: I1010 06:41:53.679772 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerStarted","Data":"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81"} Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.690430 4822 generic.go:334] "Generic (PLEG): container finished" podID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerID="145590554ea08262793a80e00ae8eeb2daf15d2d75801a13902413f227da8393" exitCode=0 Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.690521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerDied","Data":"145590554ea08262793a80e00ae8eeb2daf15d2d75801a13902413f227da8393"} Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.694545 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerStarted","Data":"c3480d312ca94790932ca6354883bc03b463bd75e6035ae95f7c287a9d29849c"} Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.702550 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" 
event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerStarted","Data":"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb"} Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.702833 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.702953 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.704571 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerStarted","Data":"02cf6c43ff65c5cfd73aa24c2571a632c72706b61bec1beff2c761fecf4a63ac"} Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.780207 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-nrgt7" podStartSLOduration=10.891529574 podStartE2EDuration="17.780179458s" podCreationTimestamp="2025-10-10 06:41:37 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.86637385 +0000 UTC m=+1049.961532046" lastFinishedPulling="2025-10-10 06:41:49.755023734 +0000 UTC m=+1056.850181930" observedRunningTime="2025-10-10 06:41:54.774527974 +0000 UTC m=+1061.869686180" watchObservedRunningTime="2025-10-10 06:41:54.780179458 +0000 UTC m=+1061.875337654" Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.806845 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.353326119 podStartE2EDuration="18.806830995s" podCreationTimestamp="2025-10-10 06:41:36 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.866655648 +0000 UTC m=+1049.961813844" lastFinishedPulling="2025-10-10 06:41:54.320160524 +0000 UTC m=+1061.415318720" observedRunningTime="2025-10-10 06:41:54.805247129 +0000 UTC m=+1061.900405325" watchObservedRunningTime="2025-10-10 
06:41:54.806830995 +0000 UTC m=+1061.901989191" Oct 10 06:41:54 crc kubenswrapper[4822]: I1010 06:41:54.836147 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.895704132 podStartE2EDuration="15.836123928s" podCreationTimestamp="2025-10-10 06:41:39 +0000 UTC" firstStartedPulling="2025-10-10 06:41:43.397728852 +0000 UTC m=+1050.492887038" lastFinishedPulling="2025-10-10 06:41:54.338148638 +0000 UTC m=+1061.433306834" observedRunningTime="2025-10-10 06:41:54.828260579 +0000 UTC m=+1061.923418795" watchObservedRunningTime="2025-10-10 06:41:54.836123928 +0000 UTC m=+1061.931282124" Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.719257 4822 generic.go:334] "Generic (PLEG): container finished" podID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerID="d368a9283a474f279f3a9e7eb440163641f3b6b196b241c7c7adf0aa687133b9" exitCode=0 Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.719340 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerDied","Data":"d368a9283a474f279f3a9e7eb440163641f3b6b196b241c7c7adf0aa687133b9"} Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.726365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerStarted","Data":"b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a"} Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.768264 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.332939186 podStartE2EDuration="26.768246729s" podCreationTimestamp="2025-10-10 06:41:29 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.633084502 +0000 UTC m=+1049.728242708" lastFinishedPulling="2025-10-10 06:41:50.068392055 +0000 UTC m=+1057.163550251" 
observedRunningTime="2025-10-10 06:41:55.762652706 +0000 UTC m=+1062.857810922" watchObservedRunningTime="2025-10-10 06:41:55.768246729 +0000 UTC m=+1062.863404925" Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.785106 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.785153 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:55 crc kubenswrapper[4822]: I1010 06:41:55.841513 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.170873 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.203647 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.736613 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerStarted","Data":"cea6be947bf52e2474312230db60ef4f19b18c57064bc9913aa1efa9a6406d53"} Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.738505 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.757436 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.717397648 podStartE2EDuration="26.757412323s" podCreationTimestamp="2025-10-10 06:41:30 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.715174964 +0000 UTC m=+1049.810333150" lastFinishedPulling="2025-10-10 06:41:49.755189629 +0000 UTC m=+1056.850347825" observedRunningTime="2025-10-10 06:41:56.75457853 
+0000 UTC m=+1063.849736756" watchObservedRunningTime="2025-10-10 06:41:56.757412323 +0000 UTC m=+1063.852570519" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.793695 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 10 06:41:56 crc kubenswrapper[4822]: I1010 06:41:56.811093 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.041976 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zp5pg"] Oct 10 06:41:57 crc kubenswrapper[4822]: E1010 06:41:57.042311 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="dnsmasq-dns" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.042333 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="dnsmasq-dns" Oct 10 06:41:57 crc kubenswrapper[4822]: E1010 06:41:57.042356 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="init" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.042364 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="init" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.042554 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ff5738-5692-449f-a488-7fd2d202590d" containerName="dnsmasq-dns" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.043559 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.045691 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.074416 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zp5pg"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.092254 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.093372 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.093633 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.095457 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.103499 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.161939 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqnb\" (UniqueName: \"kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.162463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: 
\"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.162553 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.162726 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.189328 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zp5pg"] Oct 10 06:41:57 crc kubenswrapper[4822]: E1010 06:41:57.190019 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-nxqnb ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" podUID="368b1404-4673-4c12-b38b-9212a588202e" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.213182 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.214861 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.219753 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.225857 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.264584 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265224 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265299 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265344 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265424 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxqnb\" (UniqueName: \"kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265499 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265529 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265590 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.265618 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc 
kubenswrapper[4822]: I1010 06:41:57.265660 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct98x\" (UniqueName: \"kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.266723 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.268289 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.270415 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.307586 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqnb\" (UniqueName: \"kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb\") pod \"dnsmasq-dns-7fd796d7df-zp5pg\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.311834 4822 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.313047 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.323077 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.323085 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.323078 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.323323 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xkbdc" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.331387 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366547 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366620 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366643 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-ct98x\" (UniqueName: \"kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366663 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366845 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366877 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366900 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lp9\" (UniqueName: \"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366939 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.366986 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.367010 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.367091 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.367202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.367506 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.367869 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.377413 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.377468 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.381623 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct98x\" (UniqueName: \"kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x\") pod \"ovn-controller-metrics-6l7hn\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.407890 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468188 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468319 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468355 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468448 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468491 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc 
kubenswrapper[4822]: I1010 06:41:57.468516 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468542 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468568 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468589 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468668 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468686 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqmg\" (UniqueName: \"kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.468716 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46lp9\" (UniqueName: \"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.469210 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.469708 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.469909 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.470141 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.485374 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46lp9\" (UniqueName: \"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9\") pod \"dnsmasq-dns-86db49b7ff-w76zw\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.528458 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569597 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqmg\" (UniqueName: \"kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569685 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569734 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569837 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569879 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.569935 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.570413 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.570638 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc 
kubenswrapper[4822]: I1010 06:41:57.571236 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.575114 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.576082 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.576206 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.595768 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqmg\" (UniqueName: \"kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg\") pod \"ovn-northd-0\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.651603 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.688964 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"] Oct 10 06:41:57 crc kubenswrapper[4822]: W1010 06:41:57.712003 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241b1f65_5edb_4965_b9af_e8e12b73124c.slice/crio-dae98b524e6eefa094b20d50b4a5d2c970bd592f62b81450d3ec13fb4760fc38 WatchSource:0}: Error finding container dae98b524e6eefa094b20d50b4a5d2c970bd592f62b81450d3ec13fb4760fc38: Status 404 returned error can't find the container with id dae98b524e6eefa094b20d50b4a5d2c970bd592f62b81450d3ec13fb4760fc38 Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.760879 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.760910 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6l7hn" event={"ID":"241b1f65-5edb-4965-b9af-e8e12b73124c","Type":"ContainerStarted","Data":"dae98b524e6eefa094b20d50b4a5d2c970bd592f62b81450d3ec13fb4760fc38"} Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.779079 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.876244 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb\") pod \"368b1404-4673-4c12-b38b-9212a588202e\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.876598 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc\") pod \"368b1404-4673-4c12-b38b-9212a588202e\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.876650 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxqnb\" (UniqueName: \"kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb\") pod \"368b1404-4673-4c12-b38b-9212a588202e\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.876715 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config\") pod \"368b1404-4673-4c12-b38b-9212a588202e\" (UID: \"368b1404-4673-4c12-b38b-9212a588202e\") " Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.877259 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "368b1404-4673-4c12-b38b-9212a588202e" (UID: "368b1404-4673-4c12-b38b-9212a588202e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.878041 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "368b1404-4673-4c12-b38b-9212a588202e" (UID: "368b1404-4673-4c12-b38b-9212a588202e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.878916 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config" (OuterVolumeSpecName: "config") pod "368b1404-4673-4c12-b38b-9212a588202e" (UID: "368b1404-4673-4c12-b38b-9212a588202e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.881084 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb" (OuterVolumeSpecName: "kube-api-access-nxqnb") pod "368b1404-4673-4c12-b38b-9212a588202e" (UID: "368b1404-4673-4c12-b38b-9212a588202e"). InnerVolumeSpecName "kube-api-access-nxqnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.979216 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.979250 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.979262 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368b1404-4673-4c12-b38b-9212a588202e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:57 crc kubenswrapper[4822]: I1010 06:41:57.979271 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxqnb\" (UniqueName: \"kubernetes.io/projected/368b1404-4673-4c12-b38b-9212a588202e-kube-api-access-nxqnb\") on node \"crc\" DevicePath \"\"" Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.006562 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:41:58 crc kubenswrapper[4822]: W1010 06:41:58.017509 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19d90dd9_d39e_4d83_a22a_2fdf5acfc03f.slice/crio-40703f2919d009b3773a206028166c919a48858cca0fcccc292e723c3342eae8 WatchSource:0}: Error finding container 40703f2919d009b3773a206028166c919a48858cca0fcccc292e723c3342eae8: Status 404 returned error can't find the container with id 40703f2919d009b3773a206028166c919a48858cca0fcccc292e723c3342eae8 Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.207237 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 06:41:58 crc 
kubenswrapper[4822]: I1010 06:41:58.774989 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6l7hn" event={"ID":"241b1f65-5edb-4965-b9af-e8e12b73124c","Type":"ContainerStarted","Data":"80e86c1e4ef3bdd1b551d89f9e5f8eac8c6d612c80b88f1c7b4a89756edf38ef"} Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.776514 4822 generic.go:334] "Generic (PLEG): container finished" podID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerID="63cef2f5ec36bd1ed3cedc913717949be7ff69a700f445cdfd7a48d68f6b791a" exitCode=0 Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.776584 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" event={"ID":"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f","Type":"ContainerDied","Data":"63cef2f5ec36bd1ed3cedc913717949be7ff69a700f445cdfd7a48d68f6b791a"} Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.776605 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" event={"ID":"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f","Type":"ContainerStarted","Data":"40703f2919d009b3773a206028166c919a48858cca0fcccc292e723c3342eae8"} Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.784132 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zp5pg" Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.786881 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerStarted","Data":"8c8d5b8afcf4ef3c8110883c26cf163cbbbe90ef7916c4a1e71d7f4f73f1ab4b"} Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.801313 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6l7hn" podStartSLOduration=1.8012896170000001 podStartE2EDuration="1.801289617s" podCreationTimestamp="2025-10-10 06:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:41:58.793208442 +0000 UTC m=+1065.888366658" watchObservedRunningTime="2025-10-10 06:41:58.801289617 +0000 UTC m=+1065.896447813" Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.965105 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zp5pg"] Oct 10 06:41:58 crc kubenswrapper[4822]: I1010 06:41:58.970443 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zp5pg"] Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.683357 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368b1404-4673-4c12-b38b-9212a588202e" path="/var/lib/kubelet/pods/368b1404-4673-4c12-b38b-9212a588202e/volumes" Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.793674 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" event={"ID":"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f","Type":"ContainerStarted","Data":"5c1e25f60904191ad4309d120e7f7355a61998c856aae55af5d15a83675c2422"} Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.794007 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.796518 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerStarted","Data":"a699c14506381f4dd996d947b346eb98c9c53450eafb1392ee285ce66c591795"} Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.796566 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerStarted","Data":"964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569"} Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.796714 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.831077 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.6345106679999999 podStartE2EDuration="2.831062024s" podCreationTimestamp="2025-10-10 06:41:57 +0000 UTC" firstStartedPulling="2025-10-10 06:41:58.214091807 +0000 UTC m=+1065.309249993" lastFinishedPulling="2025-10-10 06:41:59.410643153 +0000 UTC m=+1066.505801349" observedRunningTime="2025-10-10 06:41:59.82888729 +0000 UTC m=+1066.924045496" watchObservedRunningTime="2025-10-10 06:41:59.831062024 +0000 UTC m=+1066.926220220" Oct 10 06:41:59 crc kubenswrapper[4822]: I1010 06:41:59.832021 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" podStartSLOduration=2.832015591 podStartE2EDuration="2.832015591s" podCreationTimestamp="2025-10-10 06:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:41:59.813828561 +0000 UTC m=+1066.908986787" watchObservedRunningTime="2025-10-10 06:41:59.832015591 +0000 UTC m=+1066.927173787" Oct 10 
06:42:00 crc kubenswrapper[4822]: I1010 06:42:00.566551 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 10 06:42:00 crc kubenswrapper[4822]: I1010 06:42:00.566900 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 10 06:42:00 crc kubenswrapper[4822]: I1010 06:42:00.619872 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 10 06:42:00 crc kubenswrapper[4822]: I1010 06:42:00.838978 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.824683 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9hsvs"] Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.825730 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9hsvs" Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.854562 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9hsvs"] Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.939735 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.939828 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.950162 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxjk\" (UniqueName: \"kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk\") pod \"keystone-db-create-9hsvs\" (UID: \"9bd98a0c-cdbf-437e-b488-c7c1f5c81326\") " pod="openstack/keystone-db-create-9hsvs" Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 
06:42:01.990510 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gm7d6"] Oct 10 06:42:01 crc kubenswrapper[4822]: I1010 06:42:01.991528 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gm7d6" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.002267 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gm7d6"] Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.006043 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.052317 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxjk\" (UniqueName: \"kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk\") pod \"keystone-db-create-9hsvs\" (UID: \"9bd98a0c-cdbf-437e-b488-c7c1f5c81326\") " pod="openstack/keystone-db-create-9hsvs" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.069465 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxjk\" (UniqueName: \"kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk\") pod \"keystone-db-create-9hsvs\" (UID: \"9bd98a0c-cdbf-437e-b488-c7c1f5c81326\") " pod="openstack/keystone-db-create-9hsvs" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.149001 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9hsvs" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.154263 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9g9c\" (UniqueName: \"kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c\") pod \"placement-db-create-gm7d6\" (UID: \"a9f56185-1fd1-420a-9634-e982d9644d21\") " pod="openstack/placement-db-create-gm7d6" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.258422 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9g9c\" (UniqueName: \"kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c\") pod \"placement-db-create-gm7d6\" (UID: \"a9f56185-1fd1-420a-9634-e982d9644d21\") " pod="openstack/placement-db-create-gm7d6" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.277051 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9g9c\" (UniqueName: \"kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c\") pod \"placement-db-create-gm7d6\" (UID: \"a9f56185-1fd1-420a-9634-e982d9644d21\") " pod="openstack/placement-db-create-gm7d6" Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.309857 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gm7d6" Oct 10 06:42:02 crc kubenswrapper[4822]: W1010 06:42:02.596984 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bd98a0c_cdbf_437e_b488_c7c1f5c81326.slice/crio-601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71 WatchSource:0}: Error finding container 601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71: Status 404 returned error can't find the container with id 601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71 Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.598360 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9hsvs"] Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.732912 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gm7d6"] Oct 10 06:42:02 crc kubenswrapper[4822]: W1010 06:42:02.737053 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f56185_1fd1_420a_9634_e982d9644d21.slice/crio-0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c WatchSource:0}: Error finding container 0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c: Status 404 returned error can't find the container with id 0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.817811 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9hsvs" event={"ID":"9bd98a0c-cdbf-437e-b488-c7c1f5c81326","Type":"ContainerStarted","Data":"601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71"} Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.818793 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gm7d6" 
event={"ID":"a9f56185-1fd1-420a-9634-e982d9644d21","Type":"ContainerStarted","Data":"0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c"} Oct 10 06:42:02 crc kubenswrapper[4822]: I1010 06:42:02.867298 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.211191 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.315476 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.316174 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="dnsmasq-dns" containerID="cri-o://5c1e25f60904191ad4309d120e7f7355a61998c856aae55af5d15a83675c2422" gracePeriod=10 Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.318487 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.384565 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.386922 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.388554 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.506818 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.506883 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.506910 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.507029 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.507100 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pbr9r\" (UniqueName: \"kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.608603 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.608696 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbr9r\" (UniqueName: \"kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.609841 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.610209 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.610256 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.610299 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.611330 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.611580 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.612052 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.670602 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbr9r\" (UniqueName: \"kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r\") pod 
\"dnsmasq-dns-698758b865-2x4k7\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.708263 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.857089 4822 generic.go:334] "Generic (PLEG): container finished" podID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerID="5c1e25f60904191ad4309d120e7f7355a61998c856aae55af5d15a83675c2422" exitCode=0 Oct 10 06:42:04 crc kubenswrapper[4822]: I1010 06:42:04.857389 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" event={"ID":"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f","Type":"ContainerDied","Data":"5c1e25f60904191ad4309d120e7f7355a61998c856aae55af5d15a83675c2422"} Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.158981 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.358830 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.366450 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.369182 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.369371 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.369482 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rglhx" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.369753 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.394240 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.523992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p85j\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.524035 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.524158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 
06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.524186 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.524210 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625090 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625137 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625155 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625220 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p85j\" (UniqueName: 
\"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625251 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: E1010 06:42:05.625353 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:42:05 crc kubenswrapper[4822]: E1010 06:42:05.625607 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625636 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: E1010 06:42:05.625648 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:06.125633438 +0000 UTC m=+1073.220791634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625761 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.625944 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.649482 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p85j\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.657168 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:05 crc kubenswrapper[4822]: I1010 06:42:05.865674 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2x4k7" event={"ID":"e6ea6247-cd17-4b6e-b71f-b916243da4b6","Type":"ContainerStarted","Data":"ce135713c23aab06b42d6c8f97a8e2a499ae3c20864466bf0e3b3865558cedc0"} Oct 10 06:42:06 crc 
kubenswrapper[4822]: I1010 06:42:06.136162 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:06 crc kubenswrapper[4822]: E1010 06:42:06.136358 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:42:06 crc kubenswrapper[4822]: E1010 06:42:06.136593 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 06:42:06 crc kubenswrapper[4822]: E1010 06:42:06.136652 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:07.136633148 +0000 UTC m=+1074.231791344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found Oct 10 06:42:06 crc kubenswrapper[4822]: I1010 06:42:06.875411 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9hsvs" event={"ID":"9bd98a0c-cdbf-437e-b488-c7c1f5c81326","Type":"ContainerStarted","Data":"d43cccdc5c656da0d889ca4eaba683a10c7359c580ef7744e83ca9ceb5fcb33a"} Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.150773 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:07 crc kubenswrapper[4822]: E1010 06:42:07.151028 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:42:07 crc kubenswrapper[4822]: E1010 06:42:07.151068 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 06:42:07 crc kubenswrapper[4822]: E1010 06:42:07.151140 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:09.151116198 +0000 UTC m=+1076.246274404 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.269537 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mz7mk"] Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.270593 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mz7mk" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.285859 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mz7mk"] Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.456310 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpr9\" (UniqueName: \"kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9\") pod \"glance-db-create-mz7mk\" (UID: \"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89\") " pod="openstack/glance-db-create-mz7mk" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.529762 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.558905 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpr9\" (UniqueName: \"kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9\") pod \"glance-db-create-mz7mk\" (UID: \"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89\") " pod="openstack/glance-db-create-mz7mk" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.593391 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8vpr9\" (UniqueName: \"kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9\") pod \"glance-db-create-mz7mk\" (UID: \"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89\") " pod="openstack/glance-db-create-mz7mk" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.887285 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gm7d6" event={"ID":"a9f56185-1fd1-420a-9634-e982d9644d21","Type":"ContainerStarted","Data":"52abefdc1abbbdf1d6e084d55a389e7f29cd378d32aa689238a2bdc490e4f2a0"} Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.890270 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mz7mk" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.905003 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-9hsvs" podStartSLOduration=6.904982625 podStartE2EDuration="6.904982625s" podCreationTimestamp="2025-10-10 06:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:07.901448012 +0000 UTC m=+1074.996606218" watchObservedRunningTime="2025-10-10 06:42:07.904982625 +0000 UTC m=+1075.000140821" Oct 10 06:42:07 crc kubenswrapper[4822]: I1010 06:42:07.926696 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-gm7d6" podStartSLOduration=6.926649176 podStartE2EDuration="6.926649176s" podCreationTimestamp="2025-10-10 06:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:07.914337477 +0000 UTC m=+1075.009495693" watchObservedRunningTime="2025-10-10 06:42:07.926649176 +0000 UTC m=+1075.021807372" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.133900 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.270123 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.270191 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46lp9\" (UniqueName: \"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.270218 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.270364 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.270437 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.275266 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9" (OuterVolumeSpecName: "kube-api-access-46lp9") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f"). InnerVolumeSpecName "kube-api-access-46lp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.308785 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.310354 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:08 crc kubenswrapper[4822]: E1010 06:42:08.311670 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config podName:19d90dd9-d39e-4d83-a22a-2fdf5acfc03f nodeName:}" failed. No retries permitted until 2025-10-10 06:42:08.811638573 +0000 UTC m=+1075.906796769 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f") : error deleting /var/lib/kubelet/pods/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f/volume-subpaths: remove /var/lib/kubelet/pods/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f/volume-subpaths: no such file or directory Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.311993 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.372088 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.372376 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.372388 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.372400 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46lp9\" (UniqueName: \"kubernetes.io/projected/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-kube-api-access-46lp9\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 
06:42:08.415572 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mz7mk"] Oct 10 06:42:08 crc kubenswrapper[4822]: W1010 06:42:08.515949 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6d2357_fbc8_4a1e_92af_1da00c2d7b89.slice/crio-e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a WatchSource:0}: Error finding container e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a: Status 404 returned error can't find the container with id e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.881579 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") pod \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\" (UID: \"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f\") " Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.882733 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config" (OuterVolumeSpecName: "config") pod "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" (UID: "19d90dd9-d39e-4d83-a22a-2fdf5acfc03f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.900071 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" event={"ID":"19d90dd9-d39e-4d83-a22a-2fdf5acfc03f","Type":"ContainerDied","Data":"40703f2919d009b3773a206028166c919a48858cca0fcccc292e723c3342eae8"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.900143 4822 scope.go:117] "RemoveContainer" containerID="5c1e25f60904191ad4309d120e7f7355a61998c856aae55af5d15a83675c2422" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.900174 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w76zw" Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.902946 4822 generic.go:334] "Generic (PLEG): container finished" podID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerID="cd98bfb27b3e9b37541c6b1c1099ebf97943f6cf600c4013ec6157d2daade251" exitCode=0 Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.903017 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2x4k7" event={"ID":"e6ea6247-cd17-4b6e-b71f-b916243da4b6","Type":"ContainerDied","Data":"cd98bfb27b3e9b37541c6b1c1099ebf97943f6cf600c4013ec6157d2daade251"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.910795 4822 generic.go:334] "Generic (PLEG): container finished" podID="9bd98a0c-cdbf-437e-b488-c7c1f5c81326" containerID="d43cccdc5c656da0d889ca4eaba683a10c7359c580ef7744e83ca9ceb5fcb33a" exitCode=0 Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.910887 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9hsvs" event={"ID":"9bd98a0c-cdbf-437e-b488-c7c1f5c81326","Type":"ContainerDied","Data":"d43cccdc5c656da0d889ca4eaba683a10c7359c580ef7744e83ca9ceb5fcb33a"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.914636 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" containerID="5ba2ef9118bef44e8786bbc493f68dcdd5f9957d059443a5b0396944fd668c2a" exitCode=0 Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.914713 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mz7mk" event={"ID":"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89","Type":"ContainerDied","Data":"5ba2ef9118bef44e8786bbc493f68dcdd5f9957d059443a5b0396944fd668c2a"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.914745 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mz7mk" event={"ID":"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89","Type":"ContainerStarted","Data":"e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.918086 4822 generic.go:334] "Generic (PLEG): container finished" podID="a9f56185-1fd1-420a-9634-e982d9644d21" containerID="52abefdc1abbbdf1d6e084d55a389e7f29cd378d32aa689238a2bdc490e4f2a0" exitCode=0 Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.918153 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gm7d6" event={"ID":"a9f56185-1fd1-420a-9634-e982d9644d21","Type":"ContainerDied","Data":"52abefdc1abbbdf1d6e084d55a389e7f29cd378d32aa689238a2bdc490e4f2a0"} Oct 10 06:42:08 crc kubenswrapper[4822]: I1010 06:42:08.983543 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.010309 4822 scope.go:117] "RemoveContainer" containerID="63cef2f5ec36bd1ed3cedc913717949be7ff69a700f445cdfd7a48d68f6b791a" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.012972 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.018474 4822 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w76zw"] Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.186917 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.187138 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.187169 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.187259 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:13.187235277 +0000 UTC m=+1080.282393483 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.343645 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8n425"] Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.344066 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="dnsmasq-dns" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.344086 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="dnsmasq-dns" Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.344113 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="init" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.344122 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="init" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.344317 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" containerName="dnsmasq-dns" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.344961 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.347015 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.347343 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.348945 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.393265 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8n425"] Oct 10 06:42:09 crc kubenswrapper[4822]: E1010 06:42:09.393841 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kvzt6 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kvzt6 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-8n425" podUID="093ec15c-5e70-48a9-a00f-c010e1f9ee34" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.399739 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5m8q5"] Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.400702 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.413943 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5m8q5"] Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.423418 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8n425"] Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.492951 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493011 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493036 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493056 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc 
kubenswrapper[4822]: I1010 06:42:09.493689 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6gwt\" (UniqueName: \"kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493757 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493784 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvzt6\" (UniqueName: \"kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493821 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493840 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " 
pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493934 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.493993 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.494010 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.494030 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.494081 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " 
pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.596045 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.596115 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597177 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597322 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597344 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597448 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597493 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597523 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597610 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6gwt\" (UniqueName: \"kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597649 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597686 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kvzt6\" (UniqueName: \"kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597724 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597762 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.597870 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.598008 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.598058 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.599090 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.601393 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.602353 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.602537 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.602701 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " 
pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.607887 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.609228 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.614660 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.615387 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.616235 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.619744 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvzt6\" (UniqueName: \"kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6\") pod \"swift-ring-rebalance-8n425\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.638058 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6gwt\" (UniqueName: \"kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt\") pod \"swift-ring-rebalance-5m8q5\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.663265 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d90dd9-d39e-4d83-a22a-2fdf5acfc03f" path="/var/lib/kubelet/pods/19d90dd9-d39e-4d83-a22a-2fdf5acfc03f/volumes" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.717949 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.931234 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2x4k7" event={"ID":"e6ea6247-cd17-4b6e-b71f-b916243da4b6","Type":"ContainerStarted","Data":"5233ce6a762e0754c515f119b497b8ddc51c7df347fc9453f4cfb86c35c752f7"} Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.931366 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.943664 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8n425" Oct 10 06:42:09 crc kubenswrapper[4822]: I1010 06:42:09.953231 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2x4k7" podStartSLOduration=5.9532137160000005 podStartE2EDuration="5.953213716s" podCreationTimestamp="2025-10-10 06:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:09.949088026 +0000 UTC m=+1077.044246232" watchObservedRunningTime="2025-10-10 06:42:09.953213716 +0000 UTC m=+1077.048371912" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116344 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116401 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116438 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116547 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: 
\"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116576 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116652 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.116718 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvzt6\" (UniqueName: \"kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6\") pod \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\" (UID: \"093ec15c-5e70-48a9-a00f-c010e1f9ee34\") " Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.122635 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.126277 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.126382 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6" (OuterVolumeSpecName: "kube-api-access-kvzt6") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "kube-api-access-kvzt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.126539 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.129272 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.129470 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts" (OuterVolumeSpecName: "scripts") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.136587 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093ec15c-5e70-48a9-a00f-c010e1f9ee34" (UID: "093ec15c-5e70-48a9-a00f-c010e1f9ee34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.148133 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5m8q5"]
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.221893 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvzt6\" (UniqueName: \"kubernetes.io/projected/093ec15c-5e70-48a9-a00f-c010e1f9ee34-kube-api-access-kvzt6\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222128 4822 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-swiftconf\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222140 4822 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-dispersionconf\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222148 4822 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/093ec15c-5e70-48a9-a00f-c010e1f9ee34-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222158 4822 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-ring-data-devices\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222166 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093ec15c-5e70-48a9-a00f-c010e1f9ee34-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.222174 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/093ec15c-5e70-48a9-a00f-c010e1f9ee34-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.276196 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mz7mk"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.340820 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9hsvs"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.343113 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gm7d6"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.425455 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpr9\" (UniqueName: \"kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9\") pod \"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89\" (UID: \"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89\") "
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.428844 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9" (OuterVolumeSpecName: "kube-api-access-8vpr9") pod "5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" (UID: "5f6d2357-fbc8-4a1e-92af-1da00c2d7b89"). InnerVolumeSpecName "kube-api-access-8vpr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.526980 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxjk\" (UniqueName: \"kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk\") pod \"9bd98a0c-cdbf-437e-b488-c7c1f5c81326\" (UID: \"9bd98a0c-cdbf-437e-b488-c7c1f5c81326\") "
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.527185 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9g9c\" (UniqueName: \"kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c\") pod \"a9f56185-1fd1-420a-9634-e982d9644d21\" (UID: \"a9f56185-1fd1-420a-9634-e982d9644d21\") "
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.527593 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpr9\" (UniqueName: \"kubernetes.io/projected/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89-kube-api-access-8vpr9\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.530472 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk" (OuterVolumeSpecName: "kube-api-access-6qxjk") pod "9bd98a0c-cdbf-437e-b488-c7c1f5c81326" (UID: "9bd98a0c-cdbf-437e-b488-c7c1f5c81326"). InnerVolumeSpecName "kube-api-access-6qxjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.531295 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c" (OuterVolumeSpecName: "kube-api-access-b9g9c") pod "a9f56185-1fd1-420a-9634-e982d9644d21" (UID: "a9f56185-1fd1-420a-9634-e982d9644d21"). InnerVolumeSpecName "kube-api-access-b9g9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.628708 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9g9c\" (UniqueName: \"kubernetes.io/projected/a9f56185-1fd1-420a-9634-e982d9644d21-kube-api-access-b9g9c\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.628739 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxjk\" (UniqueName: \"kubernetes.io/projected/9bd98a0c-cdbf-437e-b488-c7c1f5c81326-kube-api-access-6qxjk\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.964419 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9hsvs" event={"ID":"9bd98a0c-cdbf-437e-b488-c7c1f5c81326","Type":"ContainerDied","Data":"601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71"}
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.964475 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601721c9b4805b8f0166f0d582a93d7a241b2771897872a25298d0a4737c6f71"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.965768 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9hsvs"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.972200 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mz7mk"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.972237 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mz7mk" event={"ID":"5f6d2357-fbc8-4a1e-92af-1da00c2d7b89","Type":"ContainerDied","Data":"e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a"}
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.972293 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1433387b59d66f224a20cc9eafcf219d96b1ddaba92a3b611cfe488cacd1f1a"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.975636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5m8q5" event={"ID":"1006eabf-bc60-449e-9650-5f2e2969f08c","Type":"ContainerStarted","Data":"4c7c697c186b76d31361a9f5ff2ec2db01d68df539d39cbaa94f814a14874f26"}
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.979729 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gm7d6"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.979742 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gm7d6" event={"ID":"a9f56185-1fd1-420a-9634-e982d9644d21","Type":"ContainerDied","Data":"0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c"}
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.979774 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c91f97caf619d60182504a8a5c92286ab365f9afd469a6ef1c3a41a17fd716c"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.980026 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8n425"
Oct 10 06:42:10 crc kubenswrapper[4822]: I1010 06:42:10.980050 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2x4k7"
Oct 10 06:42:11 crc kubenswrapper[4822]: I1010 06:42:11.029204 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8n425"]
Oct 10 06:42:11 crc kubenswrapper[4822]: I1010 06:42:11.044237 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-8n425"]
Oct 10 06:42:11 crc kubenswrapper[4822]: I1010 06:42:11.658981 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093ec15c-5e70-48a9-a00f-c010e1f9ee34" path="/var/lib/kubelet/pods/093ec15c-5e70-48a9-a00f-c010e1f9ee34/volumes"
Oct 10 06:42:12 crc kubenswrapper[4822]: I1010 06:42:12.717780 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 10 06:42:13 crc kubenswrapper[4822]: I1010 06:42:13.207154 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0"
Oct 10 06:42:13 crc kubenswrapper[4822]: E1010 06:42:13.207335 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 10 06:42:13 crc kubenswrapper[4822]: E1010 06:42:13.207377 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 10 06:42:13 crc kubenswrapper[4822]: E1010 06:42:13.207458 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:21.207416678 +0000 UTC m=+1088.302574874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found
Oct 10 06:42:14 crc kubenswrapper[4822]: I1010 06:42:14.711081 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2x4k7"
Oct 10 06:42:14 crc kubenswrapper[4822]: I1010 06:42:14.790320 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"]
Oct 10 06:42:14 crc kubenswrapper[4822]: I1010 06:42:14.790549 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="dnsmasq-dns" containerID="cri-o://ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f" gracePeriod=10
Oct 10 06:42:15 crc kubenswrapper[4822]: I1010 06:42:15.933056 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.034612 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5m8q5" event={"ID":"1006eabf-bc60-449e-9650-5f2e2969f08c","Type":"ContainerStarted","Data":"ff8720c0716057614e5e35ff1ba277e8d17564bb9d481e502d8d5c8094332b53"}
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.037539 4822 generic.go:334] "Generic (PLEG): container finished" podID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerID="ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f" exitCode=0
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.037576 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" event={"ID":"9f6eb39a-30d8-4484-b987-25fa4582ab89","Type":"ContainerDied","Data":"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"}
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.037598 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h" event={"ID":"9f6eb39a-30d8-4484-b987-25fa4582ab89","Type":"ContainerDied","Data":"b2992015c5eb4cf036615aedca7aae7069637cbac09b93eeaa1350ff36e35de3"}
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.037619 4822 scope.go:117] "RemoveContainer" containerID="ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.037635 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-96q8h"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.056973 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5m8q5" podStartSLOduration=1.895976867 podStartE2EDuration="7.056956709s" podCreationTimestamp="2025-10-10 06:42:09 +0000 UTC" firstStartedPulling="2025-10-10 06:42:10.183701123 +0000 UTC m=+1077.278859319" lastFinishedPulling="2025-10-10 06:42:15.344680965 +0000 UTC m=+1082.439839161" observedRunningTime="2025-10-10 06:42:16.054112186 +0000 UTC m=+1083.149270372" watchObservedRunningTime="2025-10-10 06:42:16.056956709 +0000 UTC m=+1083.152114905"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.057755 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config\") pod \"9f6eb39a-30d8-4484-b987-25fa4582ab89\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") "
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.057858 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd498\" (UniqueName: \"kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498\") pod \"9f6eb39a-30d8-4484-b987-25fa4582ab89\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") "
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.057946 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc\") pod \"9f6eb39a-30d8-4484-b987-25fa4582ab89\" (UID: \"9f6eb39a-30d8-4484-b987-25fa4582ab89\") "
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.058466 4822 scope.go:117] "RemoveContainer" containerID="2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.065319 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498" (OuterVolumeSpecName: "kube-api-access-vd498") pod "9f6eb39a-30d8-4484-b987-25fa4582ab89" (UID: "9f6eb39a-30d8-4484-b987-25fa4582ab89"). InnerVolumeSpecName "kube-api-access-vd498". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.095817 4822 scope.go:117] "RemoveContainer" containerID="ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"
Oct 10 06:42:16 crc kubenswrapper[4822]: E1010 06:42:16.096395 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f\": container with ID starting with ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f not found: ID does not exist" containerID="ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.096437 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f"} err="failed to get container status \"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f\": rpc error: code = NotFound desc = could not find container \"ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f\": container with ID starting with ec54ad9ec618fb3d5fef8cbccc2051c3cf270b02b05b43c84a4cae8a7a586b9f not found: ID does not exist"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.096461 4822 scope.go:117] "RemoveContainer" containerID="2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c"
Oct 10 06:42:16 crc kubenswrapper[4822]: E1010 06:42:16.097110 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c\": container with ID starting with 2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c not found: ID does not exist" containerID="2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.097149 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c"} err="failed to get container status \"2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c\": rpc error: code = NotFound desc = could not find container \"2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c\": container with ID starting with 2f2b59efd47f52b1bba2b8e509cd4fd4d4aa492314a4164ea8ecbbfc3eae803c not found: ID does not exist"
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.098704 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f6eb39a-30d8-4484-b987-25fa4582ab89" (UID: "9f6eb39a-30d8-4484-b987-25fa4582ab89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.112140 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config" (OuterVolumeSpecName: "config") pod "9f6eb39a-30d8-4484-b987-25fa4582ab89" (UID: "9f6eb39a-30d8-4484-b987-25fa4582ab89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.161387 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.161433 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd498\" (UniqueName: \"kubernetes.io/projected/9f6eb39a-30d8-4484-b987-25fa4582ab89-kube-api-access-vd498\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.161458 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f6eb39a-30d8-4484-b987-25fa4582ab89-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.387301 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"]
Oct 10 06:42:16 crc kubenswrapper[4822]: I1010 06:42:16.398480 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-96q8h"]
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.396962 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d814-account-create-src5c"]
Oct 10 06:42:17 crc kubenswrapper[4822]: E1010 06:42:17.397525 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f56185-1fd1-420a-9634-e982d9644d21" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.397543 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f56185-1fd1-420a-9634-e982d9644d21" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: E1010 06:42:17.397560 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="init"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.397570 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="init"
Oct 10 06:42:17 crc kubenswrapper[4822]: E1010 06:42:17.397594 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd98a0c-cdbf-437e-b488-c7c1f5c81326" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.397602 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd98a0c-cdbf-437e-b488-c7c1f5c81326" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: E1010 06:42:17.397621 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="dnsmasq-dns"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.397640 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="dnsmasq-dns"
Oct 10 06:42:17 crc kubenswrapper[4822]: E1010 06:42:17.397658 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.397670 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.398011 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" containerName="dnsmasq-dns"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.398034 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f56185-1fd1-420a-9634-e982d9644d21" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.398045 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd98a0c-cdbf-437e-b488-c7c1f5c81326" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.398056 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" containerName="mariadb-database-create"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.399606 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.401778 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.408670 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d814-account-create-src5c"]
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.490169 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp65d\" (UniqueName: \"kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d\") pod \"glance-d814-account-create-src5c\" (UID: \"4608adcf-203f-4314-a574-18318c584d21\") " pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.592595 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp65d\" (UniqueName: \"kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d\") pod \"glance-d814-account-create-src5c\" (UID: \"4608adcf-203f-4314-a574-18318c584d21\") " pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.633370 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp65d\" (UniqueName: \"kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d\") pod \"glance-d814-account-create-src5c\" (UID: \"4608adcf-203f-4314-a574-18318c584d21\") " pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.673622 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6eb39a-30d8-4484-b987-25fa4582ab89" path="/var/lib/kubelet/pods/9f6eb39a-30d8-4484-b987-25fa4582ab89/volumes"
Oct 10 06:42:17 crc kubenswrapper[4822]: I1010 06:42:17.725985 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:18 crc kubenswrapper[4822]: I1010 06:42:18.192876 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d814-account-create-src5c"]
Oct 10 06:42:18 crc kubenswrapper[4822]: W1010 06:42:18.195312 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4608adcf_203f_4314_a574_18318c584d21.slice/crio-1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724 WatchSource:0}: Error finding container 1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724: Status 404 returned error can't find the container with id 1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724
Oct 10 06:42:19 crc kubenswrapper[4822]: I1010 06:42:19.070095 4822 generic.go:334] "Generic (PLEG): container finished" podID="4608adcf-203f-4314-a574-18318c584d21" containerID="d7ad001d0d4b160f08d794d8ff2e892d3b85e7807420a3abded31fb954c6d8ca" exitCode=0
Oct 10 06:42:19 crc kubenswrapper[4822]: I1010 06:42:19.070194 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d814-account-create-src5c" event={"ID":"4608adcf-203f-4314-a574-18318c584d21","Type":"ContainerDied","Data":"d7ad001d0d4b160f08d794d8ff2e892d3b85e7807420a3abded31fb954c6d8ca"}
Oct 10 06:42:19 crc kubenswrapper[4822]: I1010 06:42:19.070383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d814-account-create-src5c" event={"ID":"4608adcf-203f-4314-a574-18318c584d21","Type":"ContainerStarted","Data":"1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724"}
Oct 10 06:42:20 crc kubenswrapper[4822]: I1010 06:42:20.449242 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:20 crc kubenswrapper[4822]: I1010 06:42:20.545370 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp65d\" (UniqueName: \"kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d\") pod \"4608adcf-203f-4314-a574-18318c584d21\" (UID: \"4608adcf-203f-4314-a574-18318c584d21\") "
Oct 10 06:42:20 crc kubenswrapper[4822]: I1010 06:42:20.552117 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d" (OuterVolumeSpecName: "kube-api-access-zp65d") pod "4608adcf-203f-4314-a574-18318c584d21" (UID: "4608adcf-203f-4314-a574-18318c584d21"). InnerVolumeSpecName "kube-api-access-zp65d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:42:20 crc kubenswrapper[4822]: I1010 06:42:20.647523 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp65d\" (UniqueName: \"kubernetes.io/projected/4608adcf-203f-4314-a574-18318c584d21-kube-api-access-zp65d\") on node \"crc\" DevicePath \"\""
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.093695 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d814-account-create-src5c" event={"ID":"4608adcf-203f-4314-a574-18318c584d21","Type":"ContainerDied","Data":"1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724"}
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.094121 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b8c5a935ca72d6c5ec4c01d809c1ec34971195a3e2a2d0dc982e5f0a1ac3724"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.093795 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d814-account-create-src5c"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.257399 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0"
Oct 10 06:42:21 crc kubenswrapper[4822]: E1010 06:42:21.257608 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 10 06:42:21 crc kubenswrapper[4822]: E1010 06:42:21.257638 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 10 06:42:21 crc kubenswrapper[4822]: E1010 06:42:21.257719 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:42:37.25768735 +0000 UTC m=+1104.352845566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : configmap "swift-ring-files" not found
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.840168 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a920-account-create-kxwvs"]
Oct 10 06:42:21 crc kubenswrapper[4822]: E1010 06:42:21.841606 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4608adcf-203f-4314-a574-18318c584d21" containerName="mariadb-account-create"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.841693 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4608adcf-203f-4314-a574-18318c584d21" containerName="mariadb-account-create"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.841963 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4608adcf-203f-4314-a574-18318c584d21" containerName="mariadb-account-create"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.842541 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a920-account-create-kxwvs"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.846506 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a920-account-create-kxwvs"]
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.846726 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 10 06:42:21 crc kubenswrapper[4822]: I1010 06:42:21.970366 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c55m\" (UniqueName: \"kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m\") pod \"keystone-a920-account-create-kxwvs\" (UID: \"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6\") " pod="openstack/keystone-a920-account-create-kxwvs"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.028376 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-16c0-account-create-x24p8"]
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.029588 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-16c0-account-create-x24p8"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.032317 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.036568 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-16c0-account-create-x24p8"]
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.072509 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c55m\" (UniqueName: \"kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m\") pod \"keystone-a920-account-create-kxwvs\" (UID: \"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6\") " pod="openstack/keystone-a920-account-create-kxwvs"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.095420 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c55m\" (UniqueName: \"kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m\") pod \"keystone-a920-account-create-kxwvs\" (UID: \"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6\") " pod="openstack/keystone-a920-account-create-kxwvs"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.102420 4822 generic.go:334] "Generic (PLEG): container finished" podID="1006eabf-bc60-449e-9650-5f2e2969f08c" containerID="ff8720c0716057614e5e35ff1ba277e8d17564bb9d481e502d8d5c8094332b53" exitCode=0
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.102457 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5m8q5" event={"ID":"1006eabf-bc60-449e-9650-5f2e2969f08c","Type":"ContainerDied","Data":"ff8720c0716057614e5e35ff1ba277e8d17564bb9d481e502d8d5c8094332b53"}
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.174364 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kkw\" (UniqueName: \"kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw\") pod \"placement-16c0-account-create-x24p8\" (UID: \"9a9d6eca-faa2-4f53-978f-a547fd3fc131\") " pod="openstack/placement-16c0-account-create-x24p8"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.184079 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a920-account-create-kxwvs"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.276455 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kkw\" (UniqueName: \"kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw\") pod \"placement-16c0-account-create-x24p8\" (UID: \"9a9d6eca-faa2-4f53-978f-a547fd3fc131\") " pod="openstack/placement-16c0-account-create-x24p8"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.312079 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kkw\" (UniqueName: \"kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw\") pod \"placement-16c0-account-create-x24p8\" (UID: \"9a9d6eca-faa2-4f53-978f-a547fd3fc131\") " pod="openstack/placement-16c0-account-create-x24p8"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.345918 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-16c0-account-create-x24p8"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.568369 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hzr4f"]
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.569739 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.572584 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.572974 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-twbbx"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.578499 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hzr4f"]
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.657100 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a920-account-create-kxwvs"]
Oct 10 06:42:22 crc kubenswrapper[4822]: W1010 06:42:22.660268 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea8fdf5_e09c_4fb6_bc16_2d53bb95f0d6.slice/crio-4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c WatchSource:0}: Error finding container 4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c: Status 404 returned error can't find the container with id 4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.686183 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.686246 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5g2\" (UniqueName: \"kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.686366 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.686420 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.787542 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.787979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.788096 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f"
Oct 10 06:42:22
crc kubenswrapper[4822]: I1010 06:42:22.788183 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5g2\" (UniqueName: \"kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.793667 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.793754 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.801034 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.804831 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5g2\" (UniqueName: \"kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2\") pod \"glance-db-sync-hzr4f\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.806620 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-16c0-account-create-x24p8"] Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.900313 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.936895 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" probeResult="failure" output=< Oct 10 06:42:22 crc kubenswrapper[4822]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 06:42:22 crc kubenswrapper[4822]: > Oct 10 06:42:22 crc kubenswrapper[4822]: I1010 06:42:22.966446 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.116069 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-16c0-account-create-x24p8" event={"ID":"9a9d6eca-faa2-4f53-978f-a547fd3fc131","Type":"ContainerStarted","Data":"dcb45fed241501e193ccf3f61f716249836d1f162c81bc32a9c507e3c8334169"} Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.116333 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-16c0-account-create-x24p8" event={"ID":"9a9d6eca-faa2-4f53-978f-a547fd3fc131","Type":"ContainerStarted","Data":"1f3c5e7d179b2e424042f225fbe6ae91480e37df9527c886da85c81a5b02edc6"} Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.118622 4822 generic.go:334] "Generic (PLEG): container finished" podID="0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" containerID="ebfd8d72e608053f57e097ccab816b8741ef0bdde0a682093445b6f0a526d64a" exitCode=0 Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.119365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a920-account-create-kxwvs" 
event={"ID":"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6","Type":"ContainerDied","Data":"ebfd8d72e608053f57e097ccab816b8741ef0bdde0a682093445b6f0a526d64a"} Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.119441 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a920-account-create-kxwvs" event={"ID":"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6","Type":"ContainerStarted","Data":"4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c"} Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.427277 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hzr4f"] Oct 10 06:42:23 crc kubenswrapper[4822]: W1010 06:42:23.435200 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6a3f897_e51e_4ecf_b173_e25d2f000a07.slice/crio-3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd WatchSource:0}: Error finding container 3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd: Status 404 returned error can't find the container with id 3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.486785 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602278 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602336 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602363 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602389 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602418 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6gwt\" (UniqueName: \"kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602568 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.602668 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift\") pod \"1006eabf-bc60-449e-9650-5f2e2969f08c\" (UID: \"1006eabf-bc60-449e-9650-5f2e2969f08c\") " Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.603334 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.603516 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.608635 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt" (OuterVolumeSpecName: "kube-api-access-k6gwt") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "kube-api-access-k6gwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.611948 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.622046 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts" (OuterVolumeSpecName: "scripts") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.628965 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.631541 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1006eabf-bc60-449e-9650-5f2e2969f08c" (UID: "1006eabf-bc60-449e-9650-5f2e2969f08c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705057 4822 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705099 4822 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1006eabf-bc60-449e-9650-5f2e2969f08c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705115 4822 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705126 4822 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705138 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1006eabf-bc60-449e-9650-5f2e2969f08c-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705151 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1006eabf-bc60-449e-9650-5f2e2969f08c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:23 crc kubenswrapper[4822]: I1010 06:42:23.705165 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6gwt\" (UniqueName: \"kubernetes.io/projected/1006eabf-bc60-449e-9650-5f2e2969f08c-kube-api-access-k6gwt\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.130982 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hzr4f" event={"ID":"c6a3f897-e51e-4ecf-b173-e25d2f000a07","Type":"ContainerStarted","Data":"3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd"} Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.136127 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5m8q5" event={"ID":"1006eabf-bc60-449e-9650-5f2e2969f08c","Type":"ContainerDied","Data":"4c7c697c186b76d31361a9f5ff2ec2db01d68df539d39cbaa94f814a14874f26"} Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.136194 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5m8q5" Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.136222 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7c697c186b76d31361a9f5ff2ec2db01d68df539d39cbaa94f814a14874f26" Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.140745 4822 generic.go:334] "Generic (PLEG): container finished" podID="9a9d6eca-faa2-4f53-978f-a547fd3fc131" containerID="dcb45fed241501e193ccf3f61f716249836d1f162c81bc32a9c507e3c8334169" exitCode=0 Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.140930 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-16c0-account-create-x24p8" event={"ID":"9a9d6eca-faa2-4f53-978f-a547fd3fc131","Type":"ContainerDied","Data":"dcb45fed241501e193ccf3f61f716249836d1f162c81bc32a9c507e3c8334169"} Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.476640 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a920-account-create-kxwvs" Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.620717 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c55m\" (UniqueName: \"kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m\") pod \"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6\" (UID: \"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6\") " Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.624833 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m" (OuterVolumeSpecName: "kube-api-access-8c55m") pod "0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" (UID: "0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6"). InnerVolumeSpecName "kube-api-access-8c55m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:24 crc kubenswrapper[4822]: I1010 06:42:24.723178 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c55m\" (UniqueName: \"kubernetes.io/projected/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6-kube-api-access-8c55m\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.148859 4822 generic.go:334] "Generic (PLEG): container finished" podID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerID="32c5a9d6afcaeb38dc9800deb4a9ed3a02f085527940c8caf49522b1f31a6f55" exitCode=0 Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.148914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerDied","Data":"32c5a9d6afcaeb38dc9800deb4a9ed3a02f085527940c8caf49522b1f31a6f55"} Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.150838 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a920-account-create-kxwvs" 
event={"ID":"0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6","Type":"ContainerDied","Data":"4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c"} Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.150912 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e4e1dc34b16b227d7cfc3706f071982342d510f839fab0dea967406d12a403c" Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.150938 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a920-account-create-kxwvs" Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.153287 4822 generic.go:334] "Generic (PLEG): container finished" podID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerID="eea57be423b28bbdd8158cba171eb7c5cb45f3b6969aa019c1c3bb74047f15a3" exitCode=0 Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.153560 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerDied","Data":"eea57be423b28bbdd8158cba171eb7c5cb45f3b6969aa019c1c3bb74047f15a3"} Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.437668 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-16c0-account-create-x24p8" Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.537525 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67kkw\" (UniqueName: \"kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw\") pod \"9a9d6eca-faa2-4f53-978f-a547fd3fc131\" (UID: \"9a9d6eca-faa2-4f53-978f-a547fd3fc131\") " Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.560441 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw" (OuterVolumeSpecName: "kube-api-access-67kkw") pod "9a9d6eca-faa2-4f53-978f-a547fd3fc131" (UID: "9a9d6eca-faa2-4f53-978f-a547fd3fc131"). InnerVolumeSpecName "kube-api-access-67kkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:25 crc kubenswrapper[4822]: I1010 06:42:25.648961 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67kkw\" (UniqueName: \"kubernetes.io/projected/9a9d6eca-faa2-4f53-978f-a547fd3fc131-kube-api-access-67kkw\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.162005 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-16c0-account-create-x24p8" event={"ID":"9a9d6eca-faa2-4f53-978f-a547fd3fc131","Type":"ContainerDied","Data":"1f3c5e7d179b2e424042f225fbe6ae91480e37df9527c886da85c81a5b02edc6"} Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.162997 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f3c5e7d179b2e424042f225fbe6ae91480e37df9527c886da85c81a5b02edc6" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.163134 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-16c0-account-create-x24p8" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.169543 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerStarted","Data":"a829a2721fe99b524e5ca7cb2318d1332bb2968f94f80cd142fd5a74891aa843"} Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.170454 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.172673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerStarted","Data":"65212beb044396c22b0fae65dc35f02089f6f4279e19167d0de48c69116a6853"} Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.173002 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.198704 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.023080735 podStartE2EDuration="59.198687321s" podCreationTimestamp="2025-10-10 06:41:27 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.718673016 +0000 UTC m=+1049.813831212" lastFinishedPulling="2025-10-10 06:41:49.894279602 +0000 UTC m=+1056.989437798" observedRunningTime="2025-10-10 06:42:26.190787541 +0000 UTC m=+1093.285945757" watchObservedRunningTime="2025-10-10 06:42:26.198687321 +0000 UTC m=+1093.293845517" Oct 10 06:42:26 crc kubenswrapper[4822]: I1010 06:42:26.225244 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.188541257 podStartE2EDuration="58.225228245s" podCreationTimestamp="2025-10-10 06:41:28 +0000 UTC" firstStartedPulling="2025-10-10 06:41:42.718394018 
+0000 UTC m=+1049.813552224" lastFinishedPulling="2025-10-10 06:41:49.755081016 +0000 UTC m=+1056.850239212" observedRunningTime="2025-10-10 06:42:26.219959221 +0000 UTC m=+1093.315117437" watchObservedRunningTime="2025-10-10 06:42:26.225228245 +0000 UTC m=+1093.320386441" Oct 10 06:42:27 crc kubenswrapper[4822]: I1010 06:42:27.933723 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" probeResult="failure" output=< Oct 10 06:42:27 crc kubenswrapper[4822]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 06:42:27 crc kubenswrapper[4822]: > Oct 10 06:42:27 crc kubenswrapper[4822]: I1010 06:42:27.974294 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.168309 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jdbx-config-hm6cm"] Oct 10 06:42:28 crc kubenswrapper[4822]: E1010 06:42:28.168622 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9d6eca-faa2-4f53-978f-a547fd3fc131" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.168633 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9d6eca-faa2-4f53-978f-a547fd3fc131" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: E1010 06:42:28.168642 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1006eabf-bc60-449e-9650-5f2e2969f08c" containerName="swift-ring-rebalance" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.168649 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1006eabf-bc60-449e-9650-5f2e2969f08c" containerName="swift-ring-rebalance" Oct 10 06:42:28 crc kubenswrapper[4822]: E1010 06:42:28.168664 4822 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.168672 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.169177 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9d6eca-faa2-4f53-978f-a547fd3fc131" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.169205 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1006eabf-bc60-449e-9650-5f2e2969f08c" containerName="swift-ring-rebalance" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.169213 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" containerName="mariadb-account-create" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.169688 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.171761 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191327 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191409 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9v5\" (UniqueName: \"kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191448 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191482 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191499 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.191514 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.227651 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jdbx-config-hm6cm"] Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292565 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9v5\" (UniqueName: \"kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292656 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292710 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run\") pod 
\"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292734 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292755 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.292867 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.293251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.293311 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: 
\"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.293892 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.294187 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.296544 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.332177 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9v5\" (UniqueName: \"kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5\") pod \"ovn-controller-5jdbx-config-hm6cm\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:28 crc kubenswrapper[4822]: I1010 06:42:28.493058 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:31 crc kubenswrapper[4822]: I1010 06:42:31.338545 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:42:31 crc kubenswrapper[4822]: I1010 06:42:31.338940 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:42:32 crc kubenswrapper[4822]: I1010 06:42:32.937480 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" probeResult="failure" output=< Oct 10 06:42:32 crc kubenswrapper[4822]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 06:42:32 crc kubenswrapper[4822]: > Oct 10 06:42:34 crc kubenswrapper[4822]: I1010 06:42:34.646741 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jdbx-config-hm6cm"] Oct 10 06:42:34 crc kubenswrapper[4822]: W1010 06:42:34.655167 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c017e40_646d_4e82_8645_c7ec1cbcd434.slice/crio-6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9 WatchSource:0}: Error finding container 6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9: Status 404 returned error can't find the container with id 6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9 Oct 10 06:42:35 crc 
kubenswrapper[4822]: I1010 06:42:35.258728 4822 generic.go:334] "Generic (PLEG): container finished" podID="1c017e40-646d-4e82-8645-c7ec1cbcd434" containerID="1ef82f2ec478186ad8805b22fb1086bf5ed62287bffc865e7603d99c71db4881" exitCode=0 Oct 10 06:42:35 crc kubenswrapper[4822]: I1010 06:42:35.258883 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx-config-hm6cm" event={"ID":"1c017e40-646d-4e82-8645-c7ec1cbcd434","Type":"ContainerDied","Data":"1ef82f2ec478186ad8805b22fb1086bf5ed62287bffc865e7603d99c71db4881"} Oct 10 06:42:35 crc kubenswrapper[4822]: I1010 06:42:35.259055 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx-config-hm6cm" event={"ID":"1c017e40-646d-4e82-8645-c7ec1cbcd434","Type":"ContainerStarted","Data":"6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9"} Oct 10 06:42:35 crc kubenswrapper[4822]: I1010 06:42:35.260555 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hzr4f" event={"ID":"c6a3f897-e51e-4ecf-b173-e25d2f000a07","Type":"ContainerStarted","Data":"5d027935eca48c7d38a1daa3d0aabc0470acb3d9c745f8b2626dd36d25168207"} Oct 10 06:42:35 crc kubenswrapper[4822]: I1010 06:42:35.304357 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hzr4f" podStartSLOduration=2.43118873 podStartE2EDuration="13.304333775s" podCreationTimestamp="2025-10-10 06:42:22 +0000 UTC" firstStartedPulling="2025-10-10 06:42:23.437507996 +0000 UTC m=+1090.532666192" lastFinishedPulling="2025-10-10 06:42:34.310653041 +0000 UTC m=+1101.405811237" observedRunningTime="2025-10-10 06:42:35.300939856 +0000 UTC m=+1102.396098052" watchObservedRunningTime="2025-10-10 06:42:35.304333775 +0000 UTC m=+1102.399491981" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.644408 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757307 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757388 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f9v5\" (UniqueName: \"kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757433 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757437 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757575 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757713 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757770 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts\") pod \"1c017e40-646d-4e82-8645-c7ec1cbcd434\" (UID: \"1c017e40-646d-4e82-8645-c7ec1cbcd434\") " Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757894 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.757997 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run" (OuterVolumeSpecName: "var-run") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.758703 4822 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.758761 4822 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.758788 4822 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c017e40-646d-4e82-8645-c7ec1cbcd434-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.759135 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.759306 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts" (OuterVolumeSpecName: "scripts") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.763587 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5" (OuterVolumeSpecName: "kube-api-access-5f9v5") pod "1c017e40-646d-4e82-8645-c7ec1cbcd434" (UID: "1c017e40-646d-4e82-8645-c7ec1cbcd434"). InnerVolumeSpecName "kube-api-access-5f9v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.860542 4822 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.860913 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c017e40-646d-4e82-8645-c7ec1cbcd434-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:36 crc kubenswrapper[4822]: I1010 06:42:36.860938 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f9v5\" (UniqueName: \"kubernetes.io/projected/1c017e40-646d-4e82-8645-c7ec1cbcd434-kube-api-access-5f9v5\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.266910 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " pod="openstack/swift-storage-0" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.274585 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"swift-storage-0\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " 
pod="openstack/swift-storage-0" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.278959 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx-config-hm6cm" event={"ID":"1c017e40-646d-4e82-8645-c7ec1cbcd434","Type":"ContainerDied","Data":"6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9"} Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.279004 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b26bf0ad8c52ed6656ad9bf585c353fc582acfaf67f3969875502b82589b4a9" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.279018 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jdbx-config-hm6cm" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.495994 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.787555 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5jdbx-config-hm6cm"] Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.805212 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5jdbx-config-hm6cm"] Oct 10 06:42:37 crc kubenswrapper[4822]: I1010 06:42:37.929633 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5jdbx" Oct 10 06:42:38 crc kubenswrapper[4822]: I1010 06:42:38.099068 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 10 06:42:38 crc kubenswrapper[4822]: W1010 06:42:38.108315 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21826954_a4ea_4715_be9e_6cd8272342dc.slice/crio-2aa154c46a6bc1c78d7a6b89025950c4f006431b7f75cd5bd318e74cb583be51 WatchSource:0}: Error finding container 
2aa154c46a6bc1c78d7a6b89025950c4f006431b7f75cd5bd318e74cb583be51: Status 404 returned error can't find the container with id 2aa154c46a6bc1c78d7a6b89025950c4f006431b7f75cd5bd318e74cb583be51 Oct 10 06:42:38 crc kubenswrapper[4822]: I1010 06:42:38.287898 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"2aa154c46a6bc1c78d7a6b89025950c4f006431b7f75cd5bd318e74cb583be51"} Oct 10 06:42:39 crc kubenswrapper[4822]: I1010 06:42:39.087372 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:42:39 crc kubenswrapper[4822]: I1010 06:42:39.378840 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 06:42:39 crc kubenswrapper[4822]: I1010 06:42:39.665820 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c017e40-646d-4e82-8645-c7ec1cbcd434" path="/var/lib/kubelet/pods/1c017e40-646d-4e82-8645-c7ec1cbcd434/volumes" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.305068 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768"} Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.305120 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc"} Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.305132 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c"} Oct 
10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.305140 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3"} Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.812044 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ntwrt"] Oct 10 06:42:40 crc kubenswrapper[4822]: E1010 06:42:40.812593 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c017e40-646d-4e82-8645-c7ec1cbcd434" containerName="ovn-config" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.812605 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c017e40-646d-4e82-8645-c7ec1cbcd434" containerName="ovn-config" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.812758 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c017e40-646d-4e82-8645-c7ec1cbcd434" containerName="ovn-config" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.813523 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.824134 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ntwrt"] Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.907286 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-w2kh5"] Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.925211 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w\") pod \"cinder-db-create-ntwrt\" (UID: \"25544312-5e94-49ce-8726-8259264add47\") " pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.940053 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w2kh5"] Oct 10 06:42:40 crc kubenswrapper[4822]: I1010 06:42:40.940192 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.026899 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwm8\" (UniqueName: \"kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8\") pod \"barbican-db-create-w2kh5\" (UID: \"ea655c41-9403-41cc-9bd1-e362ee5af607\") " pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.027068 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w\") pod \"cinder-db-create-ntwrt\" (UID: \"25544312-5e94-49ce-8726-8259264add47\") " pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.059903 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w\") pod \"cinder-db-create-ntwrt\" (UID: \"25544312-5e94-49ce-8726-8259264add47\") " pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.107738 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ngq5g"] Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.109079 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.119811 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ngq5g"] Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.128359 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwm8\" (UniqueName: \"kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8\") pod \"barbican-db-create-w2kh5\" (UID: \"ea655c41-9403-41cc-9bd1-e362ee5af607\") " pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.129196 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.150240 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwm8\" (UniqueName: \"kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8\") pod \"barbican-db-create-w2kh5\" (UID: \"ea655c41-9403-41cc-9bd1-e362ee5af607\") " pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.170123 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mrdhs"] Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.171079 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.173126 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.173380 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-x4v4k" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.173501 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.173597 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.182898 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mrdhs"] Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.229881 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r726h\" (UniqueName: \"kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.229944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.230012 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data\") pod \"keystone-db-sync-mrdhs\" (UID: 
\"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.230057 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgc9\" (UniqueName: \"kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9\") pod \"neutron-db-create-ngq5g\" (UID: \"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89\") " pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.264088 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.313094 4822 generic.go:334] "Generic (PLEG): container finished" podID="c6a3f897-e51e-4ecf-b173-e25d2f000a07" containerID="5d027935eca48c7d38a1daa3d0aabc0470acb3d9c745f8b2626dd36d25168207" exitCode=0 Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.313136 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hzr4f" event={"ID":"c6a3f897-e51e-4ecf-b173-e25d2f000a07","Type":"ContainerDied","Data":"5d027935eca48c7d38a1daa3d0aabc0470acb3d9c745f8b2626dd36d25168207"} Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.331218 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.331288 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thgc9\" (UniqueName: \"kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9\") pod \"neutron-db-create-ngq5g\" (UID: \"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89\") " pod="openstack/neutron-db-create-ngq5g" Oct 
10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.331325 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r726h\" (UniqueName: \"kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.331356 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.335488 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.346825 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r726h\" (UniqueName: \"kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.350659 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgc9\" (UniqueName: \"kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9\") pod \"neutron-db-create-ngq5g\" (UID: \"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89\") " pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.357029 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data\") pod \"keystone-db-sync-mrdhs\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.426579 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.499493 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.860246 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w2kh5"] Oct 10 06:42:41 crc kubenswrapper[4822]: W1010 06:42:41.871566 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea655c41_9403_41cc_9bd1_e362ee5af607.slice/crio-0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7 WatchSource:0}: Error finding container 0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7: Status 404 returned error can't find the container with id 0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7 Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.944838 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ntwrt"] Oct 10 06:42:41 crc kubenswrapper[4822]: I1010 06:42:41.993991 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ngq5g"] Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.054495 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mrdhs"] Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.322289 4822 generic.go:334] "Generic (PLEG): container finished" podID="ea655c41-9403-41cc-9bd1-e362ee5af607" 
containerID="fd01c0ab2957061061b8b59ccbdfb85e03a9a4c49b19f380974e1975346c0050" exitCode=0 Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.322480 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w2kh5" event={"ID":"ea655c41-9403-41cc-9bd1-e362ee5af607","Type":"ContainerDied","Data":"fd01c0ab2957061061b8b59ccbdfb85e03a9a4c49b19f380974e1975346c0050"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.322528 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w2kh5" event={"ID":"ea655c41-9403-41cc-9bd1-e362ee5af607","Type":"ContainerStarted","Data":"0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.324026 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ngq5g" event={"ID":"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89","Type":"ContainerStarted","Data":"43e47a1651cc659ca237db97e80e625b70db9b6622c2585b63079c7df3b404bd"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.324057 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ngq5g" event={"ID":"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89","Type":"ContainerStarted","Data":"df8ca77da4c02e750809c4573da71f1bc8e608b8fd9620289b3d0409396d5a62"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.328075 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.328100 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.328129 
4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.328139 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.332580 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ntwrt" event={"ID":"25544312-5e94-49ce-8726-8259264add47","Type":"ContainerStarted","Data":"df44840d8672a385cb99037b5349829505288dc4ab0c0ddee1e133f2fee5d4e1"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.332627 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ntwrt" event={"ID":"25544312-5e94-49ce-8726-8259264add47","Type":"ContainerStarted","Data":"cb329d136e45ec961cab18fcce593be53e6dba4b44fdb51b9481299a5827af57"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.336149 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdhs" event={"ID":"4fb97152-e8c1-4fd0-befc-08ea47f79cdd","Type":"ContainerStarted","Data":"45b6ecb066a2d38c141a451ba252bd716ffd2f192e13818a4913bc73d7408b4c"} Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.365018 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ngq5g" podStartSLOduration=1.365000375 podStartE2EDuration="1.365000375s" podCreationTimestamp="2025-10-10 06:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:42.358213028 +0000 UTC m=+1109.453371224" watchObservedRunningTime="2025-10-10 
06:42:42.365000375 +0000 UTC m=+1109.460158571" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.378386 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-ntwrt" podStartSLOduration=2.378368853 podStartE2EDuration="2.378368853s" podCreationTimestamp="2025-10-10 06:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:42.372794082 +0000 UTC m=+1109.467952288" watchObservedRunningTime="2025-10-10 06:42:42.378368853 +0000 UTC m=+1109.473527049" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.735340 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.855035 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data\") pod \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.855137 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle\") pod \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.855206 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data\") pod \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.855242 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5j5g2\" (UniqueName: \"kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2\") pod \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\" (UID: \"c6a3f897-e51e-4ecf-b173-e25d2f000a07\") " Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.860513 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2" (OuterVolumeSpecName: "kube-api-access-5j5g2") pod "c6a3f897-e51e-4ecf-b173-e25d2f000a07" (UID: "c6a3f897-e51e-4ecf-b173-e25d2f000a07"). InnerVolumeSpecName "kube-api-access-5j5g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.862058 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c6a3f897-e51e-4ecf-b173-e25d2f000a07" (UID: "c6a3f897-e51e-4ecf-b173-e25d2f000a07"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.878354 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6a3f897-e51e-4ecf-b173-e25d2f000a07" (UID: "c6a3f897-e51e-4ecf-b173-e25d2f000a07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.899002 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data" (OuterVolumeSpecName: "config-data") pod "c6a3f897-e51e-4ecf-b173-e25d2f000a07" (UID: "c6a3f897-e51e-4ecf-b173-e25d2f000a07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.958161 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.958202 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.958215 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a3f897-e51e-4ecf-b173-e25d2f000a07-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:42 crc kubenswrapper[4822]: I1010 06:42:42.958228 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5g2\" (UniqueName: \"kubernetes.io/projected/c6a3f897-e51e-4ecf-b173-e25d2f000a07-kube-api-access-5j5g2\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.351317 4822 generic.go:334] "Generic (PLEG): container finished" podID="25544312-5e94-49ce-8726-8259264add47" containerID="df44840d8672a385cb99037b5349829505288dc4ab0c0ddee1e133f2fee5d4e1" exitCode=0 Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.351392 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ntwrt" event={"ID":"25544312-5e94-49ce-8726-8259264add47","Type":"ContainerDied","Data":"df44840d8672a385cb99037b5349829505288dc4ab0c0ddee1e133f2fee5d4e1"} Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.364753 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hzr4f" 
event={"ID":"c6a3f897-e51e-4ecf-b173-e25d2f000a07","Type":"ContainerDied","Data":"3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd"} Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.364862 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb2a3432d6d3bc61103241490b854707ecae637ff641b11971e6aa2344faefd" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.364948 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hzr4f" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.370694 4822 generic.go:334] "Generic (PLEG): container finished" podID="8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" containerID="43e47a1651cc659ca237db97e80e625b70db9b6622c2585b63079c7df3b404bd" exitCode=0 Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.371137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ngq5g" event={"ID":"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89","Type":"ContainerDied","Data":"43e47a1651cc659ca237db97e80e625b70db9b6622c2585b63079c7df3b404bd"} Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.707437 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.733376 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:43 crc kubenswrapper[4822]: E1010 06:42:43.733703 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a3f897-e51e-4ecf-b173-e25d2f000a07" containerName="glance-db-sync" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.733720 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a3f897-e51e-4ecf-b173-e25d2f000a07" containerName="glance-db-sync" Oct 10 06:42:43 crc kubenswrapper[4822]: E1010 06:42:43.733761 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea655c41-9403-41cc-9bd1-e362ee5af607" containerName="mariadb-database-create" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.733767 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea655c41-9403-41cc-9bd1-e362ee5af607" containerName="mariadb-database-create" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.733927 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea655c41-9403-41cc-9bd1-e362ee5af607" containerName="mariadb-database-create" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.733950 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a3f897-e51e-4ecf-b173-e25d2f000a07" containerName="glance-db-sync" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.734831 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.761381 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.773926 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwm8\" (UniqueName: \"kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8\") pod \"ea655c41-9403-41cc-9bd1-e362ee5af607\" (UID: \"ea655c41-9403-41cc-9bd1-e362ee5af607\") " Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.774360 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.774465 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjmp\" (UniqueName: \"kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.774486 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.774672 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.774721 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.780260 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8" (OuterVolumeSpecName: "kube-api-access-bxwm8") pod "ea655c41-9403-41cc-9bd1-e362ee5af607" (UID: "ea655c41-9403-41cc-9bd1-e362ee5af607"). InnerVolumeSpecName "kube-api-access-bxwm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.876838 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.876931 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877004 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjmp\" (UniqueName: \"kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877031 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 
06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877165 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwm8\" (UniqueName: \"kubernetes.io/projected/ea655c41-9403-41cc-9bd1-e362ee5af607-kube-api-access-bxwm8\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877842 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.877883 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.878440 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.878560 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:43 crc kubenswrapper[4822]: I1010 06:42:43.910604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjmp\" (UniqueName: 
\"kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp\") pod \"dnsmasq-dns-5b946c75cc-bbk57\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.054109 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.382279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w2kh5" event={"ID":"ea655c41-9403-41cc-9bd1-e362ee5af607","Type":"ContainerDied","Data":"0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7"} Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.382317 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e7867f050b8eb5f6793cb899e6e04b3e656ef2ae0601ae2e91a666ef83cb3f7" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.382428 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w2kh5" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.595368 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:44 crc kubenswrapper[4822]: W1010 06:42:44.605510 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf126a15f_f398_49ab_b17f_5ea5c3111603.slice/crio-b1eb840f08d87327e6f50d0eca70961b0ea597cd5eb903fc3cd06c94c2a421de WatchSource:0}: Error finding container b1eb840f08d87327e6f50d0eca70961b0ea597cd5eb903fc3cd06c94c2a421de: Status 404 returned error can't find the container with id b1eb840f08d87327e6f50d0eca70961b0ea597cd5eb903fc3cd06c94c2a421de Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.749868 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.756127 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.791281 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thgc9\" (UniqueName: \"kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9\") pod \"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89\" (UID: \"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89\") " Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.791388 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w\") pod \"25544312-5e94-49ce-8726-8259264add47\" (UID: \"25544312-5e94-49ce-8726-8259264add47\") " Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.797644 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9" (OuterVolumeSpecName: "kube-api-access-thgc9") pod "8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" (UID: "8ae86b10-5f5b-4f05-a5d1-b39b3334ea89"). InnerVolumeSpecName "kube-api-access-thgc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.798755 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w" (OuterVolumeSpecName: "kube-api-access-d5c2w") pod "25544312-5e94-49ce-8726-8259264add47" (UID: "25544312-5e94-49ce-8726-8259264add47"). InnerVolumeSpecName "kube-api-access-d5c2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.893184 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5c2w\" (UniqueName: \"kubernetes.io/projected/25544312-5e94-49ce-8726-8259264add47-kube-api-access-d5c2w\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:44 crc kubenswrapper[4822]: I1010 06:42:44.893232 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thgc9\" (UniqueName: \"kubernetes.io/projected/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89-kube-api-access-thgc9\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.393367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerStarted","Data":"b1eb840f08d87327e6f50d0eca70961b0ea597cd5eb903fc3cd06c94c2a421de"} Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.398088 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ngq5g" event={"ID":"8ae86b10-5f5b-4f05-a5d1-b39b3334ea89","Type":"ContainerDied","Data":"df8ca77da4c02e750809c4573da71f1bc8e608b8fd9620289b3d0409396d5a62"} Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.398134 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8ca77da4c02e750809c4573da71f1bc8e608b8fd9620289b3d0409396d5a62" Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.398202 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ngq5g" Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.400884 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ntwrt" event={"ID":"25544312-5e94-49ce-8726-8259264add47","Type":"ContainerDied","Data":"cb329d136e45ec961cab18fcce593be53e6dba4b44fdb51b9481299a5827af57"} Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.400927 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb329d136e45ec961cab18fcce593be53e6dba4b44fdb51b9481299a5827af57" Oct 10 06:42:45 crc kubenswrapper[4822]: I1010 06:42:45.400963 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ntwrt" Oct 10 06:42:47 crc kubenswrapper[4822]: I1010 06:42:47.418405 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerStarted","Data":"d337aaebf645f62b8b6eb0bdee91ca4b1e332c605650c4eca63f0bf4ad6f4e8a"} Oct 10 06:42:48 crc kubenswrapper[4822]: I1010 06:42:48.433490 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12"} Oct 10 06:42:48 crc kubenswrapper[4822]: I1010 06:42:48.433837 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8"} Oct 10 06:42:48 crc kubenswrapper[4822]: I1010 06:42:48.435181 4822 generic.go:334] "Generic (PLEG): container finished" podID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerID="d337aaebf645f62b8b6eb0bdee91ca4b1e332c605650c4eca63f0bf4ad6f4e8a" exitCode=0 Oct 10 06:42:48 crc kubenswrapper[4822]: I1010 
06:42:48.435221 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerDied","Data":"d337aaebf645f62b8b6eb0bdee91ca4b1e332c605650c4eca63f0bf4ad6f4e8a"} Oct 10 06:42:49 crc kubenswrapper[4822]: I1010 06:42:49.447391 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerStarted","Data":"0da5720ef5ba82c38ba9f5b49ff2921ee80894b7df23109d382bb6458487173d"} Oct 10 06:42:49 crc kubenswrapper[4822]: I1010 06:42:49.448077 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:49 crc kubenswrapper[4822]: I1010 06:42:49.454644 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9"} Oct 10 06:42:49 crc kubenswrapper[4822]: I1010 06:42:49.454679 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4"} Oct 10 06:42:49 crc kubenswrapper[4822]: I1010 06:42:49.470247 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" podStartSLOduration=6.470230467 podStartE2EDuration="6.470230467s" podCreationTimestamp="2025-10-10 06:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:49.468129946 +0000 UTC m=+1116.563288172" watchObservedRunningTime="2025-10-10 06:42:49.470230467 +0000 UTC m=+1116.565388663" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.843851 4822 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5f70-account-create-4cjwb"] Oct 10 06:42:50 crc kubenswrapper[4822]: E1010 06:42:50.844654 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.844674 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: E1010 06:42:50.844710 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25544312-5e94-49ce-8726-8259264add47" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.844719 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="25544312-5e94-49ce-8726-8259264add47" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.845059 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.845081 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="25544312-5e94-49ce-8726-8259264add47" containerName="mariadb-database-create" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.845880 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.848283 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.851260 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5f70-account-create-4cjwb"] Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.896457 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwcp\" (UniqueName: \"kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp\") pod \"cinder-5f70-account-create-4cjwb\" (UID: \"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa\") " pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:50 crc kubenswrapper[4822]: I1010 06:42:50.998652 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwcp\" (UniqueName: \"kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp\") pod \"cinder-5f70-account-create-4cjwb\" (UID: \"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa\") " pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.026707 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwcp\" (UniqueName: \"kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp\") pod \"cinder-5f70-account-create-4cjwb\" (UID: \"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa\") " pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.045979 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9441-account-create-jrgnn"] Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.047189 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.050461 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.056442 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9441-account-create-jrgnn"] Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.100391 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhgd\" (UniqueName: \"kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd\") pod \"barbican-9441-account-create-jrgnn\" (UID: \"13a6f02a-7437-4e2b-8e89-319f47afc92f\") " pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.173490 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.202039 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhgd\" (UniqueName: \"kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd\") pod \"barbican-9441-account-create-jrgnn\" (UID: \"13a6f02a-7437-4e2b-8e89-319f47afc92f\") " pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.218450 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhgd\" (UniqueName: \"kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd\") pod \"barbican-9441-account-create-jrgnn\" (UID: \"13a6f02a-7437-4e2b-8e89-319f47afc92f\") " pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.249028 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-834d-account-create-qlz2z"] Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.251939 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.255033 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-834d-account-create-qlz2z"] Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.255604 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.303304 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74gqj\" (UniqueName: \"kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj\") pod \"neutron-834d-account-create-qlz2z\" (UID: \"fbfe8139-a900-47aa-a6dd-64f3f69c6d08\") " pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.379318 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.406102 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74gqj\" (UniqueName: \"kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj\") pod \"neutron-834d-account-create-qlz2z\" (UID: \"fbfe8139-a900-47aa-a6dd-64f3f69c6d08\") " pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.427644 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74gqj\" (UniqueName: \"kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj\") pod \"neutron-834d-account-create-qlz2z\" (UID: \"fbfe8139-a900-47aa-a6dd-64f3f69c6d08\") " pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.497432 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d"} Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.501623 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdhs" event={"ID":"4fb97152-e8c1-4fd0-befc-08ea47f79cdd","Type":"ContainerStarted","Data":"505f7482009dbc32caf5f0d8b1c400586173489bde97d410893531db26d3a1f3"} Oct 10 06:42:51 crc kubenswrapper[4822]: I1010 06:42:51.519548 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mrdhs" podStartSLOduration=1.488341024 podStartE2EDuration="10.519527637s" podCreationTimestamp="2025-10-10 06:42:41 +0000 UTC" firstStartedPulling="2025-10-10 06:42:42.074154355 +0000 UTC m=+1109.169312551" lastFinishedPulling="2025-10-10 06:42:51.105340968 +0000 UTC m=+1118.200499164" observedRunningTime="2025-10-10 
06:42:51.516071636 +0000 UTC m=+1118.611229832" watchObservedRunningTime="2025-10-10 06:42:51.519527637 +0000 UTC m=+1118.614685833" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:51.642391 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5f70-account-create-4cjwb"] Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:51.646251 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:51.806400 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9441-account-create-jrgnn"] Oct 10 06:42:53 crc kubenswrapper[4822]: W1010 06:42:51.815781 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13a6f02a_7437_4e2b_8e89_319f47afc92f.slice/crio-1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3 WatchSource:0}: Error finding container 1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3: Status 404 returned error can't find the container with id 1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.517289 4822 generic.go:334] "Generic (PLEG): container finished" podID="13a6f02a-7437-4e2b-8e89-319f47afc92f" containerID="81f118e491ac50d56dc726f1e8d71d2fc29cd767138066b7b8edf50a7d2f3ad0" exitCode=0 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.517358 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9441-account-create-jrgnn" event={"ID":"13a6f02a-7437-4e2b-8e89-319f47afc92f","Type":"ContainerDied","Data":"81f118e491ac50d56dc726f1e8d71d2fc29cd767138066b7b8edf50a7d2f3ad0"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.517567 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9441-account-create-jrgnn" 
event={"ID":"13a6f02a-7437-4e2b-8e89-319f47afc92f","Type":"ContainerStarted","Data":"1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.519594 4822 generic.go:334] "Generic (PLEG): container finished" podID="41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" containerID="7e7cf45d4577b3ec598e972f0f930d14f2f5e126f34531bbd3ee99e6c96bb7e2" exitCode=0 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.519669 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f70-account-create-4cjwb" event={"ID":"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa","Type":"ContainerDied","Data":"7e7cf45d4577b3ec598e972f0f930d14f2f5e126f34531bbd3ee99e6c96bb7e2"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.519729 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f70-account-create-4cjwb" event={"ID":"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa","Type":"ContainerStarted","Data":"f5f1ba16b162384bd52e820c80e6212f5e51cf66fa30486a6670c9672e3cd797"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.540244 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.540281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerStarted","Data":"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.597829 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.701772059 podStartE2EDuration="48.597790988s" podCreationTimestamp="2025-10-10 06:42:04 +0000 UTC" firstStartedPulling="2025-10-10 06:42:38.110433859 +0000 UTC 
m=+1105.205592055" lastFinishedPulling="2025-10-10 06:42:48.006452788 +0000 UTC m=+1115.101610984" observedRunningTime="2025-10-10 06:42:52.597230811 +0000 UTC m=+1119.692389007" watchObservedRunningTime="2025-10-10 06:42:52.597790988 +0000 UTC m=+1119.692949184" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.851256 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.851691 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="dnsmasq-dns" containerID="cri-o://0da5720ef5ba82c38ba9f5b49ff2921ee80894b7df23109d382bb6458487173d" gracePeriod=10 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.892704 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.894345 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.897149 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.913470 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946246 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946401 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946516 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946624 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946667 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85mw\" (UniqueName: \"kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:52.946740 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047763 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047834 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047887 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047919 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047945 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d85mw\" (UniqueName: \"kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.047979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.048912 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.119141 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc 
kubenswrapper[4822]: I1010 06:42:53.119238 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.119273 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.120002 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.135176 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85mw\" (UniqueName: \"kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw\") pod \"dnsmasq-dns-74f6bcbc87-g9hhc\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.227212 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.547926 4822 generic.go:334] "Generic (PLEG): container finished" podID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerID="0da5720ef5ba82c38ba9f5b49ff2921ee80894b7df23109d382bb6458487173d" exitCode=0 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.549275 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerDied","Data":"0da5720ef5ba82c38ba9f5b49ff2921ee80894b7df23109d382bb6458487173d"} Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.616119 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.625526 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-834d-account-create-qlz2z"] Oct 10 06:42:53 crc kubenswrapper[4822]: W1010 06:42:53.640445 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbfe8139_a900_47aa_a6dd_64f3f69c6d08.slice/crio-8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239 WatchSource:0}: Error finding container 8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239: Status 404 returned error can't find the container with id 8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239 Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.837542 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.926003 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.932971 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.970965 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb\") pod \"f126a15f-f398-49ab-b17f-5ea5c3111603\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.971024 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config\") pod \"f126a15f-f398-49ab-b17f-5ea5c3111603\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.971051 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc\") pod \"f126a15f-f398-49ab-b17f-5ea5c3111603\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.971175 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb\") pod \"f126a15f-f398-49ab-b17f-5ea5c3111603\" (UID: \"f126a15f-f398-49ab-b17f-5ea5c3111603\") " Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.971228 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srjmp\" (UniqueName: \"kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp\") pod \"f126a15f-f398-49ab-b17f-5ea5c3111603\" (UID: 
\"f126a15f-f398-49ab-b17f-5ea5c3111603\") " Oct 10 06:42:53 crc kubenswrapper[4822]: I1010 06:42:53.977670 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp" (OuterVolumeSpecName: "kube-api-access-srjmp") pod "f126a15f-f398-49ab-b17f-5ea5c3111603" (UID: "f126a15f-f398-49ab-b17f-5ea5c3111603"). InnerVolumeSpecName "kube-api-access-srjmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.009518 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config" (OuterVolumeSpecName: "config") pod "f126a15f-f398-49ab-b17f-5ea5c3111603" (UID: "f126a15f-f398-49ab-b17f-5ea5c3111603"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.011595 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f126a15f-f398-49ab-b17f-5ea5c3111603" (UID: "f126a15f-f398-49ab-b17f-5ea5c3111603"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.018074 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f126a15f-f398-49ab-b17f-5ea5c3111603" (UID: "f126a15f-f398-49ab-b17f-5ea5c3111603"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.020619 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f126a15f-f398-49ab-b17f-5ea5c3111603" (UID: "f126a15f-f398-49ab-b17f-5ea5c3111603"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.073186 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhgd\" (UniqueName: \"kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd\") pod \"13a6f02a-7437-4e2b-8e89-319f47afc92f\" (UID: \"13a6f02a-7437-4e2b-8e89-319f47afc92f\") " Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.073314 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjwcp\" (UniqueName: \"kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp\") pod \"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa\" (UID: \"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa\") " Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.074093 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.074114 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srjmp\" (UniqueName: \"kubernetes.io/projected/f126a15f-f398-49ab-b17f-5ea5c3111603-kube-api-access-srjmp\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.074128 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.074139 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.074149 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f126a15f-f398-49ab-b17f-5ea5c3111603-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.077322 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp" (OuterVolumeSpecName: "kube-api-access-pjwcp") pod "41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" (UID: "41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa"). InnerVolumeSpecName "kube-api-access-pjwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.078955 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd" (OuterVolumeSpecName: "kube-api-access-qjhgd") pod "13a6f02a-7437-4e2b-8e89-319f47afc92f" (UID: "13a6f02a-7437-4e2b-8e89-319f47afc92f"). InnerVolumeSpecName "kube-api-access-qjhgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.175419 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhgd\" (UniqueName: \"kubernetes.io/projected/13a6f02a-7437-4e2b-8e89-319f47afc92f-kube-api-access-qjhgd\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.175460 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjwcp\" (UniqueName: \"kubernetes.io/projected/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa-kube-api-access-pjwcp\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.560391 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" event={"ID":"f126a15f-f398-49ab-b17f-5ea5c3111603","Type":"ContainerDied","Data":"b1eb840f08d87327e6f50d0eca70961b0ea597cd5eb903fc3cd06c94c2a421de"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.560488 4822 scope.go:117] "RemoveContainer" containerID="0da5720ef5ba82c38ba9f5b49ff2921ee80894b7df23109d382bb6458487173d" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.560409 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-bbk57" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.562981 4822 generic.go:334] "Generic (PLEG): container finished" podID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerID="c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca" exitCode=0 Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.563074 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" event={"ID":"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5","Type":"ContainerDied","Data":"c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.563102 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" event={"ID":"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5","Type":"ContainerStarted","Data":"83944fd8e8a55e007d2ea8d70ae5c657cef7321ec235187642f327c0bf592e88"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.565637 4822 generic.go:334] "Generic (PLEG): container finished" podID="fbfe8139-a900-47aa-a6dd-64f3f69c6d08" containerID="00b110eb03f7b216ee678097c29a4872da8d3e2c4317e8feb9cf5c6c2b771324" exitCode=0 Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.565712 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-834d-account-create-qlz2z" event={"ID":"fbfe8139-a900-47aa-a6dd-64f3f69c6d08","Type":"ContainerDied","Data":"00b110eb03f7b216ee678097c29a4872da8d3e2c4317e8feb9cf5c6c2b771324"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.565775 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-834d-account-create-qlz2z" event={"ID":"fbfe8139-a900-47aa-a6dd-64f3f69c6d08","Type":"ContainerStarted","Data":"8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.569591 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-9441-account-create-jrgnn" event={"ID":"13a6f02a-7437-4e2b-8e89-319f47afc92f","Type":"ContainerDied","Data":"1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.569638 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a2495a57c14fbb449cf3e0e362e509e26f3f76c87ef18a1d23bd6c19bc525d3" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.569599 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9441-account-create-jrgnn" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.571369 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f70-account-create-4cjwb" event={"ID":"41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa","Type":"ContainerDied","Data":"f5f1ba16b162384bd52e820c80e6212f5e51cf66fa30486a6670c9672e3cd797"} Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.571424 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f1ba16b162384bd52e820c80e6212f5e51cf66fa30486a6670c9672e3cd797" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.571514 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5f70-account-create-4cjwb" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.595693 4822 scope.go:117] "RemoveContainer" containerID="d337aaebf645f62b8b6eb0bdee91ca4b1e332c605650c4eca63f0bf4ad6f4e8a" Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.806376 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:54 crc kubenswrapper[4822]: I1010 06:42:54.818424 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-bbk57"] Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.582948 4822 generic.go:334] "Generic (PLEG): container finished" podID="4fb97152-e8c1-4fd0-befc-08ea47f79cdd" containerID="505f7482009dbc32caf5f0d8b1c400586173489bde97d410893531db26d3a1f3" exitCode=0 Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.583012 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdhs" event={"ID":"4fb97152-e8c1-4fd0-befc-08ea47f79cdd","Type":"ContainerDied","Data":"505f7482009dbc32caf5f0d8b1c400586173489bde97d410893531db26d3a1f3"} Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.588621 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" event={"ID":"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5","Type":"ContainerStarted","Data":"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6"} Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.620400 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" podStartSLOduration=3.620380842 podStartE2EDuration="3.620380842s" podCreationTimestamp="2025-10-10 06:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:55.617133368 +0000 UTC m=+1122.712291584" watchObservedRunningTime="2025-10-10 06:42:55.620380842 
+0000 UTC m=+1122.715539038" Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.663715 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" path="/var/lib/kubelet/pods/f126a15f-f398-49ab-b17f-5ea5c3111603/volumes" Oct 10 06:42:55 crc kubenswrapper[4822]: I1010 06:42:55.934397 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.106697 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74gqj\" (UniqueName: \"kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj\") pod \"fbfe8139-a900-47aa-a6dd-64f3f69c6d08\" (UID: \"fbfe8139-a900-47aa-a6dd-64f3f69c6d08\") " Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.112104 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj" (OuterVolumeSpecName: "kube-api-access-74gqj") pod "fbfe8139-a900-47aa-a6dd-64f3f69c6d08" (UID: "fbfe8139-a900-47aa-a6dd-64f3f69c6d08"). InnerVolumeSpecName "kube-api-access-74gqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.208691 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74gqj\" (UniqueName: \"kubernetes.io/projected/fbfe8139-a900-47aa-a6dd-64f3f69c6d08-kube-api-access-74gqj\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.598557 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-834d-account-create-qlz2z" event={"ID":"fbfe8139-a900-47aa-a6dd-64f3f69c6d08","Type":"ContainerDied","Data":"8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239"} Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.598611 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ed86e7d9eccaf9d59363dd651feb77648b1b0b00f4857bf8665cd4449b7a239" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.598732 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.598750 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-834d-account-create-qlz2z" Oct 10 06:42:56 crc kubenswrapper[4822]: I1010 06:42:56.907992 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.021692 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data\") pod \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.021853 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r726h\" (UniqueName: \"kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h\") pod \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.021997 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle\") pod \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\" (UID: \"4fb97152-e8c1-4fd0-befc-08ea47f79cdd\") " Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.026824 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h" (OuterVolumeSpecName: "kube-api-access-r726h") pod "4fb97152-e8c1-4fd0-befc-08ea47f79cdd" (UID: "4fb97152-e8c1-4fd0-befc-08ea47f79cdd"). InnerVolumeSpecName "kube-api-access-r726h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.045375 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fb97152-e8c1-4fd0-befc-08ea47f79cdd" (UID: "4fb97152-e8c1-4fd0-befc-08ea47f79cdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.088438 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data" (OuterVolumeSpecName: "config-data") pod "4fb97152-e8c1-4fd0-befc-08ea47f79cdd" (UID: "4fb97152-e8c1-4fd0-befc-08ea47f79cdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.124168 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.124209 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.124227 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r726h\" (UniqueName: \"kubernetes.io/projected/4fb97152-e8c1-4fd0-befc-08ea47f79cdd-kube-api-access-r726h\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.608622 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mrdhs" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.608625 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdhs" event={"ID":"4fb97152-e8c1-4fd0-befc-08ea47f79cdd","Type":"ContainerDied","Data":"45b6ecb066a2d38c141a451ba252bd716ffd2f192e13818a4913bc73d7408b4c"} Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.608673 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b6ecb066a2d38c141a451ba252bd716ffd2f192e13818a4913bc73d7408b4c" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.871986 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.878515 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-npscb"] Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.882967 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="init" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883001 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="init" Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.883031 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfe8139-a900-47aa-a6dd-64f3f69c6d08" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883038 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfe8139-a900-47aa-a6dd-64f3f69c6d08" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.883047 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="dnsmasq-dns" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883054 4822 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="dnsmasq-dns" Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.883079 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb97152-e8c1-4fd0-befc-08ea47f79cdd" containerName="keystone-db-sync" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883085 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb97152-e8c1-4fd0-befc-08ea47f79cdd" containerName="keystone-db-sync" Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.883094 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a6f02a-7437-4e2b-8e89-319f47afc92f" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883099 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a6f02a-7437-4e2b-8e89-319f47afc92f" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: E1010 06:42:57.883112 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883118 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883377 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f126a15f-f398-49ab-b17f-5ea5c3111603" containerName="dnsmasq-dns" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883386 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883396 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a6f02a-7437-4e2b-8e89-319f47afc92f" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883406 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb97152-e8c1-4fd0-befc-08ea47f79cdd" containerName="keystone-db-sync" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.883419 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfe8139-a900-47aa-a6dd-64f3f69c6d08" containerName="mariadb-account-create" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.888616 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.895283 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-x4v4k" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.895477 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.895719 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.895885 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.897538 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-npscb"] Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945462 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945503 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945525 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945578 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945619 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkt4\" (UniqueName: \"kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.945670 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 06:42:57.958491 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j6rtj"] Oct 10 06:42:57 crc kubenswrapper[4822]: I1010 
06:42:57.961157 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.039420 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j6rtj"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.053221 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.053955 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.054092 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.054164 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.060006 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070012 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtjf\" (UniqueName: \"kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070130 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070351 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070462 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070559 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.070698 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkt4\" (UniqueName: \"kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.071928 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.093408 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.107497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys\") 
pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.110420 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.111748 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkt4\" (UniqueName: \"kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.119248 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle\") pod \"keystone-bootstrap-npscb\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.120855 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-t5dtz"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.132865 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.135194 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t5dtz"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.136350 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.141311 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mjpm2" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.141531 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178190 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178209 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178244 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178289 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178318 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178359 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178412 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178444 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wmngr\" (UniqueName: \"kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178480 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178498 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtjf\" (UniqueName: \"kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.178518 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.180097 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.180755 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.184068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.186173 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.190431 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.205078 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.207325 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.213215 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.213987 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.224956 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j6rtj"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.231032 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtjf\" (UniqueName: \"kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf\") pod \"dnsmasq-dns-847c4cc679-j6rtj\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: E1010 06:42:58.231676 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-6gtjf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" podUID="1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.240258 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8jr42"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.241252 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.242570 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-npscb" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.253988 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.254178 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-974cx" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.254295 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.263012 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293789 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293856 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkfcd\" (UniqueName: \"kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293880 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293910 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293923 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293952 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293972 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.293987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294006 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294038 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294079 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294122 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294139 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckl5\" (UniqueName: \"kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle\") 
pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294178 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmngr\" (UniqueName: \"kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294196 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294223 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.294240 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.297396 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc 
kubenswrapper[4822]: I1010 06:42:58.307981 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jr42"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.316600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.330990 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmngr\" (UniqueName: \"kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.333696 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.335235 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.340049 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.341366 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.342004 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle\") pod \"cinder-db-sync-t5dtz\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.352671 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-plp78"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.363574 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.371215 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-58bsc" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.371419 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.371862 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.389251 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-plp78"] Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396525 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396580 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396615 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396630 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396654 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396678 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396709 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396758 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396782 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb\") pod 
\"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396887 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396932 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.396958 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397003 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckl5\" (UniqueName: \"kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397024 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " 
pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397058 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397080 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7hvf\" (UniqueName: \"kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397109 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397120 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397211 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397242 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g447g\" (UniqueName: \"kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397292 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.397333 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkfcd\" (UniqueName: \"kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.401542 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.401755 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.401758 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.401989 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.405269 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.407455 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.412023 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.412205 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 
06:42:58.413509 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.422350 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkfcd\" (UniqueName: \"kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd\") pod \"placement-db-sync-8jr42\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.427038 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckl5\" (UniqueName: \"kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5\") pod \"ceilometer-0\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.499379 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7hvf\" (UniqueName: \"kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.499789 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.499895 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g447g\" (UniqueName: \"kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.499936 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.499995 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.500049 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.500080 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.500332 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.500399 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.500933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.501068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.501920 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.502789 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb\") 
pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.503391 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.505650 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.515613 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.521433 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g447g\" (UniqueName: \"kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g\") pod \"barbican-db-sync-plp78\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.523242 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7hvf\" (UniqueName: \"kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf\") pod \"dnsmasq-dns-785d8bcb8c-2twgn\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.565319 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.575953 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.615548 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.616106 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="dnsmasq-dns" containerID="cri-o://242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6" gracePeriod=10 Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.635013 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.705518 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jr42" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711291 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711387 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711439 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711497 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711522 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtjf\" (UniqueName: \"kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.711944 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.712074 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.712234 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.712315 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config\") pod \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\" (UID: \"1e67a6d9-5b0b-4085-a373-6633bb6ac6a0\") " Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.712324 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.712704 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config" (OuterVolumeSpecName: "config") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715224 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf" (OuterVolumeSpecName: "kube-api-access-6gtjf") pod "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" (UID: "1e67a6d9-5b0b-4085-a373-6633bb6ac6a0"). InnerVolumeSpecName "kube-api-access-6gtjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715529 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715541 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715550 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715558 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715583 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gtjf\" (UniqueName: \"kubernetes.io/projected/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-kube-api-access-6gtjf\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.715591 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.727586 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.733608 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-plp78" Oct 10 06:42:58 crc kubenswrapper[4822]: I1010 06:42:58.809604 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-npscb"] Oct 10 06:42:58 crc kubenswrapper[4822]: W1010 06:42:58.811512 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod931f5d25_00d6_462d_b166_fbc36aff32f5.slice/crio-1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf WatchSource:0}: Error finding container 1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf: Status 404 returned error can't find the container with id 1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.054343 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.056437 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.061201 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.061330 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.061498 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-twbbx" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.061635 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.125682 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t5dtz"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128335 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128395 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128442 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128507 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128570 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknhj\" (UniqueName: \"kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128593 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128664 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.128714 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.138321 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.170216 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.172528 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: W1010 06:42:59.175946 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe4da4cf_3be6_4ea1_a01c_7c7333816cb8.slice/crio-9df45187618bde4d7d9a1b5829bd2f6dd2e19bedc710f4861c99cc55a81b08b6 WatchSource:0}: Error finding container 9df45187618bde4d7d9a1b5829bd2f6dd2e19bedc710f4861c99cc55a81b08b6: Status 404 returned error can't find the container with id 9df45187618bde4d7d9a1b5829bd2f6dd2e19bedc710f4861c99cc55a81b08b6 Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.177058 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.177363 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.178444 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.186386 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.231512 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknhj\" (UniqueName: \"kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.231854 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.231904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.231947 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.231992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.232018 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.232038 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.232077 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.233354 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.235195 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.235438 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.239274 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.241141 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.241716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.249014 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknhj\" (UniqueName: \"kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.249018 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.271035 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.325316 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333127 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333214 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333232 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333309 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333362 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333383 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvhc\" (UniqueName: \"kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333419 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.333446 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.337315 4822 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-plp78"] Oct 10 06:42:59 crc kubenswrapper[4822]: W1010 06:42:59.339282 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1de65e8_9721_4039_8c67_9fbb0d715693.slice/crio-3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2 WatchSource:0}: Error finding container 3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2: Status 404 returned error can't find the container with id 3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2 Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.358345 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.368313 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.435734 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436044 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436076 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvhc\" (UniqueName: 
\"kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436114 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436144 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436239 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436242 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436297 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.436318 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.437476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.437581 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.440990 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.451890 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " 
pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.452490 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.463848 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvhc\" (UniqueName: \"kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.471978 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.497782 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.513640 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jr42"] Oct 10 06:42:59 crc kubenswrapper[4822]: W1010 06:42:59.525491 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc82a5873_895d_4f27_ab6b_c264a59949a6.slice/crio-84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8 WatchSource:0}: Error finding container 84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8: Status 404 returned error can't find the container with id 84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8 Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.538536 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.538775 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.538839 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.538858 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85mw\" (UniqueName: \"kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.538943 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.539024 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb\") pod \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\" (UID: \"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5\") " Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.546871 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw" (OuterVolumeSpecName: "kube-api-access-d85mw") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "kube-api-access-d85mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.613427 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.617576 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config" (OuterVolumeSpecName: "config") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.629601 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.631270 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t5dtz" event={"ID":"41b14558-f019-4f51-a3ab-b5689de6336a","Type":"ContainerStarted","Data":"3becba294ebd3a5d27f9b5f101e2b78bd5911f0762ae9c88b947612f4d20a9c9"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.631896 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.634153 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-npscb" event={"ID":"931f5d25-00d6-462d-b166-fbc36aff32f5","Type":"ContainerStarted","Data":"c07fb29fe58638b86ce3b23d65282f35d480c438268a6fc34c0920e7d3904766"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.634188 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-npscb" event={"ID":"931f5d25-00d6-462d-b166-fbc36aff32f5","Type":"ContainerStarted","Data":"1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.636701 4822 generic.go:334] "Generic (PLEG): container finished" podID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerID="49a62a065ff824c72d62c05f8af2fd8d6f353aec586bfd2f8c9eb655880e1638" exitCode=0 Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.636827 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" event={"ID":"56ec6334-97d6-4fa8-8f14-ce44ab82aa15","Type":"ContainerDied","Data":"49a62a065ff824c72d62c05f8af2fd8d6f353aec586bfd2f8c9eb655880e1638"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.636850 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" event={"ID":"56ec6334-97d6-4fa8-8f14-ce44ab82aa15","Type":"ContainerStarted","Data":"c98624a7c1c42d6180bc26e56111fbc01bf472396c68be1aa9951cd140730b1a"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640571 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640598 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640607 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640615 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d85mw\" (UniqueName: \"kubernetes.io/projected/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-kube-api-access-d85mw\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640624 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640937 4822 generic.go:334] "Generic (PLEG): container finished" podID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerID="242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6" exitCode=0 Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.640985 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" event={"ID":"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5","Type":"ContainerDied","Data":"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.641025 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" event={"ID":"0c8b1bfe-fe28-43d2-bca4-d0772934e6d5","Type":"ContainerDied","Data":"83944fd8e8a55e007d2ea8d70ae5c657cef7321ec235187642f327c0bf592e88"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.641040 4822 scope.go:117] "RemoveContainer" containerID="242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6" Oct 10 06:42:59 crc 
kubenswrapper[4822]: I1010 06:42:59.641135 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g9hhc" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.642776 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jr42" event={"ID":"c82a5873-895d-4f27-ab6b-c264a59949a6","Type":"ContainerStarted","Data":"84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.644302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerStarted","Data":"9df45187618bde4d7d9a1b5829bd2f6dd2e19bedc710f4861c99cc55a81b08b6"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.646088 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" (UID: "0c8b1bfe-fe28-43d2-bca4-d0772934e6d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.650931 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j6rtj" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.651235 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-plp78" event={"ID":"a1de65e8-9721-4039-8c67-9fbb0d715693","Type":"ContainerStarted","Data":"3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2"} Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.659253 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-npscb" podStartSLOduration=2.659209508 podStartE2EDuration="2.659209508s" podCreationTimestamp="2025-10-10 06:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:42:59.6479095 +0000 UTC m=+1126.743067716" watchObservedRunningTime="2025-10-10 06:42:59.659209508 +0000 UTC m=+1126.754367704" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.666671 4822 scope.go:117] "RemoveContainer" containerID="c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.692890 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.713351 4822 scope.go:117] "RemoveContainer" containerID="242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6" Oct 10 06:42:59 crc kubenswrapper[4822]: E1010 06:42:59.714619 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6\": container with ID starting with 242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6 not found: ID does not exist" containerID="242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.714670 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6"} err="failed to get container status \"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6\": rpc error: code = NotFound desc = could not find container \"242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6\": container with ID starting with 242a8579c09f9046e4906d99c8d3765ffec2b8cb9fc61e97609a33c87388c2d6 not found: ID does not exist" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.714697 4822 scope.go:117] "RemoveContainer" containerID="c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca" Oct 10 06:42:59 crc kubenswrapper[4822]: E1010 06:42:59.715203 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca\": container with ID starting with c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca not found: ID does not exist" containerID="c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 
06:42:59.715260 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca"} err="failed to get container status \"c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca\": rpc error: code = NotFound desc = could not find container \"c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca\": container with ID starting with c0f9d1c9f4e3fce29dba72ff596ca97d851f1a212a771f8a6c5065f9f7baa3ca not found: ID does not exist" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.740429 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j6rtj"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.742825 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.750294 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j6rtj"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.981468 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:42:59 crc kubenswrapper[4822]: I1010 06:42:59.994005 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g9hhc"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.005950 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.158675 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.244867 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 
06:43:00.324638 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.420077 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.679256 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" event={"ID":"56ec6334-97d6-4fa8-8f14-ce44ab82aa15","Type":"ContainerStarted","Data":"b6c57ce6bc8804ab9c4e266d92708e442dcf3712d835baf3e1ad4258627ebf92"} Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.679581 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.682020 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerStarted","Data":"360b55f4510a9bcce0914f25213a762d28d89ac03c3efadd700cd82bb911412e"} Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.693087 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerStarted","Data":"5ec58751ad7f6f7566129c8ece4bf0491aa15add50449cc18b15818e1c4113d0"} Oct 10 06:43:00 crc kubenswrapper[4822]: I1010 06:43:00.705994 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" podStartSLOduration=2.705972005 podStartE2EDuration="2.705972005s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:00.696488219 +0000 UTC m=+1127.791646425" watchObservedRunningTime="2025-10-10 06:43:00.705972005 +0000 UTC m=+1127.801130201" Oct 10 06:43:01 crc 
kubenswrapper[4822]: I1010 06:43:01.336440 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.336491 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.563070 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qqsbv"] Oct 10 06:43:01 crc kubenswrapper[4822]: E1010 06:43:01.563901 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="dnsmasq-dns" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.563922 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="dnsmasq-dns" Oct 10 06:43:01 crc kubenswrapper[4822]: E1010 06:43:01.563945 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="init" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.563953 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="init" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.564219 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" containerName="dnsmasq-dns" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.565033 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.569761 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqsbv"] Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.570592 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.570888 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8snl8" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.572224 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.664716 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c8b1bfe-fe28-43d2-bca4-d0772934e6d5" path="/var/lib/kubelet/pods/0c8b1bfe-fe28-43d2-bca4-d0772934e6d5/volumes" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.666237 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e67a6d9-5b0b-4085-a373-6633bb6ac6a0" path="/var/lib/kubelet/pods/1e67a6d9-5b0b-4085-a373-6633bb6ac6a0/volumes" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.693616 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.693701 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7w2\" (UniqueName: \"kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " 
pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.693722 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.716887 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerStarted","Data":"16d4ffadb84b60d504c9b8da0cb9aafee3acad779b7420618b961baaa7bfdfb9"} Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.721765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerStarted","Data":"44872c57530e80362bc4d529842e8600690e5ab439f2a56cd4ae26f0210dbacf"} Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.795346 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.795487 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7w2\" (UniqueName: \"kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.796017 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.801597 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.801607 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.813696 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7w2\" (UniqueName: \"kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2\") pod \"neutron-db-sync-qqsbv\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:01 crc kubenswrapper[4822]: I1010 06:43:01.915906 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.509146 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqsbv"] Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.733234 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-httpd" containerID="cri-o://6393c6148e1ccaed1dfc18210b26d7ce34617da6722f27236a94d2515ff58d03" gracePeriod=30 Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.733212 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-log" containerID="cri-o://16d4ffadb84b60d504c9b8da0cb9aafee3acad779b7420618b961baaa7bfdfb9" gracePeriod=30 Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.733161 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerStarted","Data":"6393c6148e1ccaed1dfc18210b26d7ce34617da6722f27236a94d2515ff58d03"} Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.737759 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerStarted","Data":"e6affdff48e5a58345328789f746db1fdd55f7e30e98b070cc479d76ad62da0f"} Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.737944 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-log" containerID="cri-o://44872c57530e80362bc4d529842e8600690e5ab439f2a56cd4ae26f0210dbacf" gracePeriod=30 Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.738079 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-httpd" containerID="cri-o://e6affdff48e5a58345328789f746db1fdd55f7e30e98b070cc479d76ad62da0f" gracePeriod=30 Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.755963 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.755941804 podStartE2EDuration="5.755941804s" podCreationTimestamp="2025-10-10 06:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:02.749428705 +0000 UTC m=+1129.844586911" watchObservedRunningTime="2025-10-10 06:43:02.755941804 +0000 UTC m=+1129.851100000" Oct 10 06:43:02 crc kubenswrapper[4822]: I1010 06:43:02.780620 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.7805940190000005 podStartE2EDuration="4.780594019s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:02.771274789 +0000 UTC m=+1129.866433005" watchObservedRunningTime="2025-10-10 06:43:02.780594019 +0000 UTC m=+1129.875752225" Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.760346 4822 generic.go:334] "Generic (PLEG): container finished" podID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerID="e6affdff48e5a58345328789f746db1fdd55f7e30e98b070cc479d76ad62da0f" exitCode=0 Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.760649 4822 generic.go:334] "Generic (PLEG): container finished" podID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerID="44872c57530e80362bc4d529842e8600690e5ab439f2a56cd4ae26f0210dbacf" exitCode=143 Oct 10 06:43:03 crc kubenswrapper[4822]: 
I1010 06:43:03.760709 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerDied","Data":"e6affdff48e5a58345328789f746db1fdd55f7e30e98b070cc479d76ad62da0f"} Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.760735 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerDied","Data":"44872c57530e80362bc4d529842e8600690e5ab439f2a56cd4ae26f0210dbacf"} Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.763412 4822 generic.go:334] "Generic (PLEG): container finished" podID="931f5d25-00d6-462d-b166-fbc36aff32f5" containerID="c07fb29fe58638b86ce3b23d65282f35d480c438268a6fc34c0920e7d3904766" exitCode=0 Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.763468 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-npscb" event={"ID":"931f5d25-00d6-462d-b166-fbc36aff32f5","Type":"ContainerDied","Data":"c07fb29fe58638b86ce3b23d65282f35d480c438268a6fc34c0920e7d3904766"} Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.766319 4822 generic.go:334] "Generic (PLEG): container finished" podID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerID="6393c6148e1ccaed1dfc18210b26d7ce34617da6722f27236a94d2515ff58d03" exitCode=0 Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.766348 4822 generic.go:334] "Generic (PLEG): container finished" podID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerID="16d4ffadb84b60d504c9b8da0cb9aafee3acad779b7420618b961baaa7bfdfb9" exitCode=143 Oct 10 06:43:03 crc kubenswrapper[4822]: I1010 06:43:03.766373 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerDied","Data":"6393c6148e1ccaed1dfc18210b26d7ce34617da6722f27236a94d2515ff58d03"} Oct 10 06:43:03 crc 
kubenswrapper[4822]: I1010 06:43:03.766398 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerDied","Data":"16d4ffadb84b60d504c9b8da0cb9aafee3acad779b7420618b961baaa7bfdfb9"} Oct 10 06:43:04 crc kubenswrapper[4822]: W1010 06:43:04.083099 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d3156ed_785b_492d_923f_cbd97a996b43.slice/crio-472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2 WatchSource:0}: Error finding container 472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2: Status 404 returned error can't find the container with id 472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2 Oct 10 06:43:04 crc kubenswrapper[4822]: I1010 06:43:04.789438 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqsbv" event={"ID":"3d3156ed-785b-492d-923f-cbd97a996b43","Type":"ContainerStarted","Data":"472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2"} Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.458689 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-npscb" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605264 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605402 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605459 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605476 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605538 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.605564 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkt4\" (UniqueName: 
\"kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4\") pod \"931f5d25-00d6-462d-b166-fbc36aff32f5\" (UID: \"931f5d25-00d6-462d-b166-fbc36aff32f5\") " Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.611520 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.612373 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.622929 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4" (OuterVolumeSpecName: "kube-api-access-ckkt4") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "kube-api-access-ckkt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.625960 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts" (OuterVolumeSpecName: "scripts") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.682620 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data" (OuterVolumeSpecName: "config-data") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.697858 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "931f5d25-00d6-462d-b166-fbc36aff32f5" (UID: "931f5d25-00d6-462d-b166-fbc36aff32f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.708090 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.708342 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.708422 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.708493 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc 
kubenswrapper[4822]: I1010 06:43:06.708585 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkt4\" (UniqueName: \"kubernetes.io/projected/931f5d25-00d6-462d-b166-fbc36aff32f5-kube-api-access-ckkt4\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.708671 4822 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/931f5d25-00d6-462d-b166-fbc36aff32f5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.808317 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-npscb" event={"ID":"931f5d25-00d6-462d-b166-fbc36aff32f5","Type":"ContainerDied","Data":"1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf"} Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.808368 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1197cf2b13bbe4f0773386184ef3354f981f0ff06160aef8d050ea3414d1eddf" Oct 10 06:43:06 crc kubenswrapper[4822]: I1010 06:43:06.808430 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-npscb" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.533941 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-npscb"] Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.539464 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-npscb"] Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.635064 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mkfd6"] Oct 10 06:43:07 crc kubenswrapper[4822]: E1010 06:43:07.635457 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931f5d25-00d6-462d-b166-fbc36aff32f5" containerName="keystone-bootstrap" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.635470 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="931f5d25-00d6-462d-b166-fbc36aff32f5" containerName="keystone-bootstrap" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.635637 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="931f5d25-00d6-462d-b166-fbc36aff32f5" containerName="keystone-bootstrap" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.636207 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.640157 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-x4v4k" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.640460 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.641594 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.642354 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.665718 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931f5d25-00d6-462d-b166-fbc36aff32f5" path="/var/lib/kubelet/pods/931f5d25-00d6-462d-b166-fbc36aff32f5/volumes" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.666301 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mkfd6"] Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829693 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829811 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsd95\" (UniqueName: \"kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829845 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829884 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829915 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.829954 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.931904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.932045 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.932113 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsd95\" (UniqueName: \"kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.932152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.932179 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.932224 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.942366 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.942670 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.944935 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.947267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.949524 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts\") pod \"keystone-bootstrap-mkfd6\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.955515 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsd95\" (UniqueName: \"kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95\") pod \"keystone-bootstrap-mkfd6\" (UID: 
\"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:07 crc kubenswrapper[4822]: I1010 06:43:07.957544 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:08 crc kubenswrapper[4822]: I1010 06:43:08.729933 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:43:08 crc kubenswrapper[4822]: I1010 06:43:08.786835 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:43:08 crc kubenswrapper[4822]: I1010 06:43:08.787064 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2x4k7" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" containerID="cri-o://5233ce6a762e0754c515f119b497b8ddc51c7df347fc9453f4cfb86c35c752f7" gracePeriod=10 Oct 10 06:43:08 crc kubenswrapper[4822]: I1010 06:43:08.996632 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.155956 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.155997 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156025 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156075 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156106 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnvhc\" (UniqueName: \"kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156145 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156164 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.156200 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs\") pod \"45e54f1b-b0e0-477c-818e-5beabea6611d\" (UID: \"45e54f1b-b0e0-477c-818e-5beabea6611d\") " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.157220 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs" (OuterVolumeSpecName: "logs") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.157263 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.161943 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts" (OuterVolumeSpecName: "scripts") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.162257 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc" (OuterVolumeSpecName: "kube-api-access-pnvhc") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "kube-api-access-pnvhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.197035 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.197518 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.209626 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.230135 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data" (OuterVolumeSpecName: "config-data") pod "45e54f1b-b0e0-477c-818e-5beabea6611d" (UID: "45e54f1b-b0e0-477c-818e-5beabea6611d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257708 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnvhc\" (UniqueName: \"kubernetes.io/projected/45e54f1b-b0e0-477c-818e-5beabea6611d-kube-api-access-pnvhc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257742 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257775 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257788 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 
crc kubenswrapper[4822]: I1010 06:43:09.257812 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257820 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e54f1b-b0e0-477c-818e-5beabea6611d-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257828 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.257836 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45e54f1b-b0e0-477c-818e-5beabea6611d-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.278656 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.359121 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.710061 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2x4k7" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.839727 4822 generic.go:334] "Generic (PLEG): container finished" podID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" 
containerID="5233ce6a762e0754c515f119b497b8ddc51c7df347fc9453f4cfb86c35c752f7" exitCode=0 Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.839780 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2x4k7" event={"ID":"e6ea6247-cd17-4b6e-b71f-b916243da4b6","Type":"ContainerDied","Data":"5233ce6a762e0754c515f119b497b8ddc51c7df347fc9453f4cfb86c35c752f7"} Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.841411 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"45e54f1b-b0e0-477c-818e-5beabea6611d","Type":"ContainerDied","Data":"360b55f4510a9bcce0914f25213a762d28d89ac03c3efadd700cd82bb911412e"} Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.841442 4822 scope.go:117] "RemoveContainer" containerID="e6affdff48e5a58345328789f746db1fdd55f7e30e98b070cc479d76ad62da0f" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.841473 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.864590 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.870134 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.894343 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:09 crc kubenswrapper[4822]: E1010 06:43:09.894923 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-httpd" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.894944 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-httpd" Oct 10 06:43:09 crc kubenswrapper[4822]: E1010 06:43:09.894959 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-log" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.894966 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-log" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.895163 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-httpd" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.895192 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" containerName="glance-log" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.896356 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.902368 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.902490 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.905266 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972228 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972286 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972484 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972541 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972567 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972592 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972658 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:09 crc kubenswrapper[4822]: I1010 06:43:09.972708 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47zrn\" (UniqueName: \"kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074054 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074113 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074195 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074212 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074234 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074256 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074281 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074303 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47zrn\" (UniqueName: \"kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074689 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.074752 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.075389 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.080462 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.080574 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.080574 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.081746 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.093234 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47zrn\" (UniqueName: \"kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" 
Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.103776 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " pod="openstack/glance-default-external-api-0" Oct 10 06:43:10 crc kubenswrapper[4822]: I1010 06:43:10.222094 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:43:11 crc kubenswrapper[4822]: I1010 06:43:11.664187 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e54f1b-b0e0-477c-818e-5beabea6611d" path="/var/lib/kubelet/pods/45e54f1b-b0e0-477c-818e-5beabea6611d/volumes" Oct 10 06:43:14 crc kubenswrapper[4822]: I1010 06:43:14.710548 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2x4k7" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Oct 10 06:43:16 crc kubenswrapper[4822]: E1010 06:43:16.746948 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 10 06:43:16 crc kubenswrapper[4822]: E1010 06:43:16.747463 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g447g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-plp78_openstack(a1de65e8-9721-4039-8c67-9fbb0d715693): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 06:43:16 crc kubenswrapper[4822]: E1010 06:43:16.748701 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-plp78" 
podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.825508 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.902283 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"204e0c54-6e8b-4cb2-961e-2233030b40f4","Type":"ContainerDied","Data":"5ec58751ad7f6f7566129c8ece4bf0491aa15add50449cc18b15818e1c4113d0"} Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.902318 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:16 crc kubenswrapper[4822]: E1010 06:43:16.905607 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-plp78" podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.987330 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.987386 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.987426 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.987497 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dknhj\" (UniqueName: \"kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.987994 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989014 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989051 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989084 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: 
\"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989110 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"204e0c54-6e8b-4cb2-961e-2233030b40f4\" (UID: \"204e0c54-6e8b-4cb2-961e-2233030b40f4\") " Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989364 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs" (OuterVolumeSpecName: "logs") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989951 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.989978 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204e0c54-6e8b-4cb2-961e-2233030b40f4-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.992262 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.992903 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts" (OuterVolumeSpecName: "scripts") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:16 crc kubenswrapper[4822]: I1010 06:43:16.993910 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj" (OuterVolumeSpecName: "kube-api-access-dknhj") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "kube-api-access-dknhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.014160 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.036837 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.038735 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data" (OuterVolumeSpecName: "config-data") pod "204e0c54-6e8b-4cb2-961e-2233030b40f4" (UID: "204e0c54-6e8b-4cb2-961e-2233030b40f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091149 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091192 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091212 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091224 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091241 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204e0c54-6e8b-4cb2-961e-2233030b40f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.091255 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dknhj\" (UniqueName: 
\"kubernetes.io/projected/204e0c54-6e8b-4cb2-961e-2233030b40f4-kube-api-access-dknhj\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.128075 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.192448 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.248378 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.272163 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.285970 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:17 crc kubenswrapper[4822]: E1010 06:43:17.286452 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-httpd" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.286476 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-httpd" Oct 10 06:43:17 crc kubenswrapper[4822]: E1010 06:43:17.286498 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-log" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.286506 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-log" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.286704 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-httpd" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.286741 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" containerName="glance-log" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.288442 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.290739 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.291535 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.297566 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395280 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395322 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395349 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395381 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxp9\" (UniqueName: \"kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395401 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395422 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395470 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.395492 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.496995 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497079 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497216 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497255 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497298 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497355 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxp9\" (UniqueName: \"kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497398 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.497437 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.498260 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.498313 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.498399 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.502139 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.502285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.503600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.506602 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.522107 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxp9\" (UniqueName: \"kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.526162 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.617130 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.660893 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204e0c54-6e8b-4cb2-961e-2233030b40f4" path="/var/lib/kubelet/pods/204e0c54-6e8b-4cb2-961e-2233030b40f4/volumes" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.943968 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2x4k7" event={"ID":"e6ea6247-cd17-4b6e-b71f-b916243da4b6","Type":"ContainerDied","Data":"ce135713c23aab06b42d6c8f97a8e2a499ae3c20864466bf0e3b3865558cedc0"} Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.944277 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce135713c23aab06b42d6c8f97a8e2a499ae3c20864466bf0e3b3865558cedc0" Oct 10 06:43:17 crc kubenswrapper[4822]: E1010 06:43:17.959222 4822 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 10 06:43:17 crc kubenswrapper[4822]: E1010 06:43:17.959416 4822 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmngr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Live
nessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t5dtz_openstack(41b14558-f019-4f51-a3ab-b5689de6336a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 06:43:17 crc kubenswrapper[4822]: E1010 06:43:17.960999 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-t5dtz" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" Oct 10 06:43:17 crc kubenswrapper[4822]: I1010 06:43:17.972165 4822 scope.go:117] "RemoveContainer" containerID="44872c57530e80362bc4d529842e8600690e5ab439f2a56cd4ae26f0210dbacf" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.108010 4822 scope.go:117] "RemoveContainer" containerID="6393c6148e1ccaed1dfc18210b26d7ce34617da6722f27236a94d2515ff58d03" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.108198 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.167614 4822 scope.go:117] "RemoveContainer" containerID="16d4ffadb84b60d504c9b8da0cb9aafee3acad779b7420618b961baaa7bfdfb9" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.211935 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbr9r\" (UniqueName: \"kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r\") pod \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.212078 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb\") pod \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.212206 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc\") pod \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.212278 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config\") pod \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\" (UID: \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.212314 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb\") pod \"e6ea6247-cd17-4b6e-b71f-b916243da4b6\" (UID: 
\"e6ea6247-cd17-4b6e-b71f-b916243da4b6\") " Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.223022 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r" (OuterVolumeSpecName: "kube-api-access-pbr9r") pod "e6ea6247-cd17-4b6e-b71f-b916243da4b6" (UID: "e6ea6247-cd17-4b6e-b71f-b916243da4b6"). InnerVolumeSpecName "kube-api-access-pbr9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.312127 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6ea6247-cd17-4b6e-b71f-b916243da4b6" (UID: "e6ea6247-cd17-4b6e-b71f-b916243da4b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.312166 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6ea6247-cd17-4b6e-b71f-b916243da4b6" (UID: "e6ea6247-cd17-4b6e-b71f-b916243da4b6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.314470 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.314507 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.314617 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbr9r\" (UniqueName: \"kubernetes.io/projected/e6ea6247-cd17-4b6e-b71f-b916243da4b6-kube-api-access-pbr9r\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.320879 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6ea6247-cd17-4b6e-b71f-b916243da4b6" (UID: "e6ea6247-cd17-4b6e-b71f-b916243da4b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.321532 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config" (OuterVolumeSpecName: "config") pod "e6ea6247-cd17-4b6e-b71f-b916243da4b6" (UID: "e6ea6247-cd17-4b6e-b71f-b916243da4b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.415988 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.416038 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6ea6247-cd17-4b6e-b71f-b916243da4b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.447870 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mkfd6"] Oct 10 06:43:18 crc kubenswrapper[4822]: W1010 06:43:18.455663 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c095c1b_93a3_4c9f_bea7_1d7e6310d06a.slice/crio-b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f WatchSource:0}: Error finding container b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f: Status 404 returned error can't find the container with id b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.597239 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.958184 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jr42" event={"ID":"c82a5873-895d-4f27-ab6b-c264a59949a6","Type":"ContainerStarted","Data":"0141cd260ea12894ec8962858e0a4a76f6db81c7131a7c678f8f6be121b43c25"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.961113 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerStarted","Data":"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.964576 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mkfd6" event={"ID":"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a","Type":"ContainerStarted","Data":"dd2a30b180ec632a6893d74f2fdd44fa6c7795beb5d5790d7be79ba27adf7c63"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.964617 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mkfd6" event={"ID":"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a","Type":"ContainerStarted","Data":"b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.966013 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerStarted","Data":"94305d8fe92a8e1899f3745ac1a95b787ab2309fd904d0ea7066c788bba80e0b"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.968274 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqsbv" event={"ID":"3d3156ed-785b-492d-923f-cbd97a996b43","Type":"ContainerStarted","Data":"024bb9bd5ec88357a6f36b69c390ab51145b978575520dba76b734934c4cb667"} Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.968406 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2x4k7" Oct 10 06:43:18 crc kubenswrapper[4822]: E1010 06:43:18.971762 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-t5dtz" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.977605 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8jr42" podStartSLOduration=3.765959676 podStartE2EDuration="20.977589751s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="2025-10-10 06:42:59.5276312 +0000 UTC m=+1126.622789396" lastFinishedPulling="2025-10-10 06:43:16.739261275 +0000 UTC m=+1143.834419471" observedRunningTime="2025-10-10 06:43:18.976595222 +0000 UTC m=+1146.071753428" watchObservedRunningTime="2025-10-10 06:43:18.977589751 +0000 UTC m=+1146.072747957" Oct 10 06:43:18 crc kubenswrapper[4822]: I1010 06:43:18.998794 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qqsbv" podStartSLOduration=17.998775206 podStartE2EDuration="17.998775206s" podCreationTimestamp="2025-10-10 06:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:18.995223792 +0000 UTC m=+1146.090381998" watchObservedRunningTime="2025-10-10 06:43:18.998775206 +0000 UTC m=+1146.093933412" Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.032162 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mkfd6" podStartSLOduration=12.032147074 podStartE2EDuration="12.032147074s" podCreationTimestamp="2025-10-10 06:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:19.027063086 +0000 UTC m=+1146.122221292" watchObservedRunningTime="2025-10-10 06:43:19.032147074 +0000 UTC m=+1146.127305270" Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.045206 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.051770 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2x4k7"] Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.507069 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.665258 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" path="/var/lib/kubelet/pods/e6ea6247-cd17-4b6e-b71f-b916243da4b6/volumes" Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.977629 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerStarted","Data":"df54d7695e6d9e19dfe5b752cb8933d915e122251669c1e7350d7995c242c312"} Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.986279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerStarted","Data":"bbc088dc97d4b3f85b5d015621c645d96e6edf8aab87d72f1d14921dc62dc983"} Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.986335 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerStarted","Data":"573ef14ace4b4f889d3c0ab84ad4f3bfe8b82904867dcb11b92e9e04da75f069"} Oct 10 06:43:19 crc kubenswrapper[4822]: I1010 06:43:19.990758 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerStarted","Data":"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132"} Oct 10 06:43:21 crc kubenswrapper[4822]: I1010 06:43:21.018625 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.01860682 podStartE2EDuration="12.01860682s" podCreationTimestamp="2025-10-10 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:21.016764957 +0000 UTC m=+1148.111923173" watchObservedRunningTime="2025-10-10 06:43:21.01860682 +0000 UTC m=+1148.113765016" Oct 10 06:43:22 crc kubenswrapper[4822]: I1010 06:43:22.008321 4822 generic.go:334] "Generic (PLEG): container finished" podID="c82a5873-895d-4f27-ab6b-c264a59949a6" containerID="0141cd260ea12894ec8962858e0a4a76f6db81c7131a7c678f8f6be121b43c25" exitCode=0 Oct 10 06:43:22 crc kubenswrapper[4822]: I1010 06:43:22.008394 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jr42" event={"ID":"c82a5873-895d-4f27-ab6b-c264a59949a6","Type":"ContainerDied","Data":"0141cd260ea12894ec8962858e0a4a76f6db81c7131a7c678f8f6be121b43c25"} Oct 10 06:43:22 crc kubenswrapper[4822]: I1010 06:43:22.011828 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerStarted","Data":"b4aa99e351481e0bb7dbfcc43395a335eb68b03ff6aafce0ff40b4b34ffeb8ba"} Oct 10 06:43:22 crc kubenswrapper[4822]: I1010 06:43:22.011861 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerStarted","Data":"f0f97deab7199ec122d10381e74550fceb86eea97acef347ad6e88fcb9754e6a"} Oct 10 06:43:22 crc 
kubenswrapper[4822]: I1010 06:43:22.048701 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.048679933 podStartE2EDuration="5.048679933s" podCreationTimestamp="2025-10-10 06:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:22.041947387 +0000 UTC m=+1149.137105603" watchObservedRunningTime="2025-10-10 06:43:22.048679933 +0000 UTC m=+1149.143838149" Oct 10 06:43:23 crc kubenswrapper[4822]: I1010 06:43:23.024761 4822 generic.go:334] "Generic (PLEG): container finished" podID="8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" containerID="dd2a30b180ec632a6893d74f2fdd44fa6c7795beb5d5790d7be79ba27adf7c63" exitCode=0 Oct 10 06:43:23 crc kubenswrapper[4822]: I1010 06:43:23.024983 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mkfd6" event={"ID":"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a","Type":"ContainerDied","Data":"dd2a30b180ec632a6893d74f2fdd44fa6c7795beb5d5790d7be79ba27adf7c63"} Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.877489 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.884349 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jr42" Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997229 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997277 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle\") pod \"c82a5873-895d-4f27-ab6b-c264a59949a6\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997306 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsd95\" (UniqueName: \"kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997349 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkfcd\" (UniqueName: \"kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd\") pod \"c82a5873-895d-4f27-ab6b-c264a59949a6\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997433 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997481 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data\") pod \"c82a5873-895d-4f27-ab6b-c264a59949a6\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997514 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs\") pod \"c82a5873-895d-4f27-ab6b-c264a59949a6\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997534 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997582 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts\") pod \"c82a5873-895d-4f27-ab6b-c264a59949a6\" (UID: \"c82a5873-895d-4f27-ab6b-c264a59949a6\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997614 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.997635 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys\") pod \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\" (UID: \"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a\") " Oct 10 06:43:26 crc kubenswrapper[4822]: I1010 06:43:26.999395 4822 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs" (OuterVolumeSpecName: "logs") pod "c82a5873-895d-4f27-ab6b-c264a59949a6" (UID: "c82a5873-895d-4f27-ab6b-c264a59949a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.003196 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts" (OuterVolumeSpecName: "scripts") pod "c82a5873-895d-4f27-ab6b-c264a59949a6" (UID: "c82a5873-895d-4f27-ab6b-c264a59949a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.003521 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts" (OuterVolumeSpecName: "scripts") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.004080 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.004099 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95" (OuterVolumeSpecName: "kube-api-access-lsd95") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). 
InnerVolumeSpecName "kube-api-access-lsd95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.004358 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd" (OuterVolumeSpecName: "kube-api-access-kkfcd") pod "c82a5873-895d-4f27-ab6b-c264a59949a6" (UID: "c82a5873-895d-4f27-ab6b-c264a59949a6"). InnerVolumeSpecName "kube-api-access-kkfcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.016569 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.026327 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c82a5873-895d-4f27-ab6b-c264a59949a6" (UID: "c82a5873-895d-4f27-ab6b-c264a59949a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.026690 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data" (OuterVolumeSpecName: "config-data") pod "c82a5873-895d-4f27-ab6b-c264a59949a6" (UID: "c82a5873-895d-4f27-ab6b-c264a59949a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.027646 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data" (OuterVolumeSpecName: "config-data") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.029027 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" (UID: "8c095c1b-93a3-4c9f-bea7-1d7e6310d06a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.070847 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jr42" event={"ID":"c82a5873-895d-4f27-ab6b-c264a59949a6","Type":"ContainerDied","Data":"84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8"} Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.070880 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jr42" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.070904 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f361ce6cce96faecb70569b3a66ff06917175bc7bb74640af7a9be4339afa8" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.073911 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerStarted","Data":"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad"} Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.075309 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mkfd6" event={"ID":"8c095c1b-93a3-4c9f-bea7-1d7e6310d06a","Type":"ContainerDied","Data":"b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f"} Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.075477 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d8c785d01b75b6e6d37d0a8902c399ff828a30d764f6f35ef4fd72db20b78f" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.075367 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mkfd6" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098839 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098865 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098874 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82a5873-895d-4f27-ab6b-c264a59949a6-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098882 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098892 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098903 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098914 4822 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098924 4822 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098935 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82a5873-895d-4f27-ab6b-c264a59949a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098945 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsd95\" (UniqueName: \"kubernetes.io/projected/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a-kube-api-access-lsd95\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.098956 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkfcd\" (UniqueName: \"kubernetes.io/projected/c82a5873-895d-4f27-ab6b-c264a59949a6-kube-api-access-kkfcd\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.618721 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.618782 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.674494 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:27 crc kubenswrapper[4822]: I1010 06:43:27.676429 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.006272 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:43:28 crc kubenswrapper[4822]: E1010 06:43:28.007023 4822 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="init" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007038 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="init" Oct 10 06:43:28 crc kubenswrapper[4822]: E1010 06:43:28.007065 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" containerName="keystone-bootstrap" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007073 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" containerName="keystone-bootstrap" Oct 10 06:43:28 crc kubenswrapper[4822]: E1010 06:43:28.007086 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007096 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" Oct 10 06:43:28 crc kubenswrapper[4822]: E1010 06:43:28.007116 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82a5873-895d-4f27-ab6b-c264a59949a6" containerName="placement-db-sync" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007123 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82a5873-895d-4f27-ab6b-c264a59949a6" containerName="placement-db-sync" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007310 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82a5873-895d-4f27-ab6b-c264a59949a6" containerName="placement-db-sync" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007328 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" containerName="keystone-bootstrap" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.007350 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e6ea6247-cd17-4b6e-b71f-b916243da4b6" containerName="dnsmasq-dns" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.008072 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.012238 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.012259 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.012240 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.013204 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.013349 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.017065 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-x4v4k" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.020235 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.088184 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.088219 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.090074 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fc97c446d-qd577"] Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.091835 4822 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.094702 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.095035 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.095107 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-974cx" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.095296 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.095320 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.111357 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc97c446d-qd577"] Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115060 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115101 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115152 
4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxn5\" (UniqueName: \"kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115220 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115267 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115295 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115336 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.115350 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.216762 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.216844 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qqx\" (UniqueName: \"kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.216887 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxn5\" (UniqueName: \"kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.216933 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.216977 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217044 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217074 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217111 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217163 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217204 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217248 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217281 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217313 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217343 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.217383 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.223582 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.223734 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.225119 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.225510 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.227481 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs\") pod \"keystone-5b98d46cf9-66pcm\" (UID: 
\"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.228755 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.228987 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.238761 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxn5\" (UniqueName: \"kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5\") pod \"keystone-5b98d46cf9-66pcm\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") " pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318622 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318710 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 
06:43:28.318759 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318785 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318869 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.318911 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qqx\" (UniqueName: \"kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.319111 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.322559 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.322884 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.323056 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.323323 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.330555 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.332428 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.335181 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qqx\" (UniqueName: \"kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx\") pod \"placement-6fc97c446d-qd577\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.430792 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:28 crc kubenswrapper[4822]: I1010 06:43:28.875099 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:43:29 crc kubenswrapper[4822]: I1010 06:43:29.001396 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc97c446d-qd577"] Oct 10 06:43:29 crc kubenswrapper[4822]: W1010 06:43:29.003214 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3420c1f4_bf0d_4de6_90a4_c00e0722d911.slice/crio-a86813d124296ecd04e3e363249f2055025a09990fd535e0d96795cb4849ed4b WatchSource:0}: Error finding container a86813d124296ecd04e3e363249f2055025a09990fd535e0d96795cb4849ed4b: Status 404 returned error can't find the container with id a86813d124296ecd04e3e363249f2055025a09990fd535e0d96795cb4849ed4b Oct 10 06:43:29 crc kubenswrapper[4822]: I1010 06:43:29.097434 4822 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerStarted","Data":"a86813d124296ecd04e3e363249f2055025a09990fd535e0d96795cb4849ed4b"} Oct 10 06:43:29 crc kubenswrapper[4822]: I1010 06:43:29.099780 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b98d46cf9-66pcm" event={"ID":"dc3ce4fd-4bba-4242-91ba-076cf3729770","Type":"ContainerStarted","Data":"42176a300fc8047fe12fa1c102b7a1082e60513284de2910e9936d11a415c9bc"} Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.114289 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b98d46cf9-66pcm" event={"ID":"dc3ce4fd-4bba-4242-91ba-076cf3729770","Type":"ContainerStarted","Data":"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6"} Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.115182 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.120521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerStarted","Data":"a1f35f0eceeefdb49f84dae81b99596158510c253b94b22659bccc55d2420d00"} Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.121051 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerStarted","Data":"f37aae3b6d57636153fe7116fe10389d8dc0009af7d1ea6600168de93249a807"} Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.121358 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.121497 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:30 crc 
kubenswrapper[4822]: I1010 06:43:30.142348 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b98d46cf9-66pcm" podStartSLOduration=3.142325148 podStartE2EDuration="3.142325148s" podCreationTimestamp="2025-10-10 06:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:30.139562578 +0000 UTC m=+1157.234720824" watchObservedRunningTime="2025-10-10 06:43:30.142325148 +0000 UTC m=+1157.237483344" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.174681 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fc97c446d-qd577" podStartSLOduration=2.174652456 podStartE2EDuration="2.174652456s" podCreationTimestamp="2025-10-10 06:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:30.164716598 +0000 UTC m=+1157.259874794" watchObservedRunningTime="2025-10-10 06:43:30.174652456 +0000 UTC m=+1157.269810652" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.215900 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.216052 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.222811 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.222871 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.273783 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 
06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.279389 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 06:43:30 crc kubenswrapper[4822]: I1010 06:43:30.359328 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.133745 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-plp78" event={"ID":"a1de65e8-9721-4039-8c67-9fbb0d715693","Type":"ContainerStarted","Data":"f839fda56f2538723eb3a69164f746c8534c1810606224e142e99f555c972316"} Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.135443 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.135477 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.154650 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-plp78" podStartSLOduration=2.351246131 podStartE2EDuration="33.154634055s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="2025-10-10 06:42:59.341377835 +0000 UTC m=+1126.436536031" lastFinishedPulling="2025-10-10 06:43:30.144765759 +0000 UTC m=+1157.239923955" observedRunningTime="2025-10-10 06:43:31.149700151 +0000 UTC m=+1158.244858347" watchObservedRunningTime="2025-10-10 06:43:31.154634055 +0000 UTC m=+1158.249792251" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.337252 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.337332 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.337381 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.338159 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:43:31 crc kubenswrapper[4822]: I1010 06:43:31.338232 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812" gracePeriod=600 Oct 10 06:43:32 crc kubenswrapper[4822]: I1010 06:43:32.148743 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812" exitCode=0 Oct 10 06:43:32 crc kubenswrapper[4822]: I1010 06:43:32.149745 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812"} Oct 10 06:43:32 crc kubenswrapper[4822]: I1010 06:43:32.149816 4822 scope.go:117] "RemoveContainer" containerID="3148af7555cf9f4072513a4f7349d4cc748c64df0fab673b49a83ef0fc2fe122" Oct 10 06:43:33 crc kubenswrapper[4822]: I1010 06:43:33.106922 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 06:43:33 crc kubenswrapper[4822]: I1010 06:43:33.109575 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.180709 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerStarted","Data":"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578"} Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.181235 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.180890 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="proxy-httpd" containerID="cri-o://a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578" gracePeriod=30 Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.180859 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-central-agent" containerID="cri-o://9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e" gracePeriod=30 Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.180912 4822 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-notification-agent" containerID="cri-o://6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132" gracePeriod=30 Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.180901 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="sg-core" containerID="cri-o://cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad" gracePeriod=30 Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.183647 4822 generic.go:334] "Generic (PLEG): container finished" podID="a1de65e8-9721-4039-8c67-9fbb0d715693" containerID="f839fda56f2538723eb3a69164f746c8534c1810606224e142e99f555c972316" exitCode=0 Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.183720 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-plp78" event={"ID":"a1de65e8-9721-4039-8c67-9fbb0d715693","Type":"ContainerDied","Data":"f839fda56f2538723eb3a69164f746c8534c1810606224e142e99f555c972316"} Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.187767 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8"} Oct 10 06:43:35 crc kubenswrapper[4822]: I1010 06:43:35.204924 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.645924253 podStartE2EDuration="37.204902403s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="2025-10-10 06:42:59.179575349 +0000 UTC m=+1126.274733545" lastFinishedPulling="2025-10-10 06:43:34.738553479 +0000 UTC m=+1161.833711695" observedRunningTime="2025-10-10 06:43:35.200873096 +0000 UTC m=+1162.296031292" 
watchObservedRunningTime="2025-10-10 06:43:35.204902403 +0000 UTC m=+1162.300060599" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.197952 4822 generic.go:334] "Generic (PLEG): container finished" podID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerID="a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578" exitCode=0 Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.199361 4822 generic.go:334] "Generic (PLEG): container finished" podID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerID="cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad" exitCode=2 Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.199433 4822 generic.go:334] "Generic (PLEG): container finished" podID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerID="9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e" exitCode=0 Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.198061 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerDied","Data":"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578"} Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.199672 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerDied","Data":"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad"} Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.199741 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerDied","Data":"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e"} Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.201640 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t5dtz" 
event={"ID":"41b14558-f019-4f51-a3ab-b5689de6336a","Type":"ContainerStarted","Data":"83166d2b466bbc334a7a76449841bb9af7bb6da59d0f938b0df0671307d8888b"} Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.204162 4822 generic.go:334] "Generic (PLEG): container finished" podID="3d3156ed-785b-492d-923f-cbd97a996b43" containerID="024bb9bd5ec88357a6f36b69c390ab51145b978575520dba76b734934c4cb667" exitCode=0 Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.204289 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqsbv" event={"ID":"3d3156ed-785b-492d-923f-cbd97a996b43","Type":"ContainerDied","Data":"024bb9bd5ec88357a6f36b69c390ab51145b978575520dba76b734934c4cb667"} Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.220722 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-t5dtz" podStartSLOduration=2.54555444 podStartE2EDuration="38.220702961s" podCreationTimestamp="2025-10-10 06:42:58 +0000 UTC" firstStartedPulling="2025-10-10 06:42:59.056153977 +0000 UTC m=+1126.151312173" lastFinishedPulling="2025-10-10 06:43:34.731302498 +0000 UTC m=+1161.826460694" observedRunningTime="2025-10-10 06:43:36.217962862 +0000 UTC m=+1163.313121068" watchObservedRunningTime="2025-10-10 06:43:36.220702961 +0000 UTC m=+1163.315861157" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.550101 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-plp78" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.614887 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.678528 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle\") pod \"a1de65e8-9721-4039-8c67-9fbb0d715693\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.678585 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data\") pod \"a1de65e8-9721-4039-8c67-9fbb0d715693\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.678635 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g447g\" (UniqueName: \"kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g\") pod \"a1de65e8-9721-4039-8c67-9fbb0d715693\" (UID: \"a1de65e8-9721-4039-8c67-9fbb0d715693\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.684912 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g" (OuterVolumeSpecName: "kube-api-access-g447g") pod "a1de65e8-9721-4039-8c67-9fbb0d715693" (UID: "a1de65e8-9721-4039-8c67-9fbb0d715693"). InnerVolumeSpecName "kube-api-access-g447g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.685048 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a1de65e8-9721-4039-8c67-9fbb0d715693" (UID: "a1de65e8-9721-4039-8c67-9fbb0d715693"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.706868 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1de65e8-9721-4039-8c67-9fbb0d715693" (UID: "a1de65e8-9721-4039-8c67-9fbb0d715693"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.779936 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780017 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780090 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780127 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 
06:43:36.780154 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780226 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xckl5\" (UniqueName: \"kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780262 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data\") pod \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\" (UID: \"be4da4cf-3be6-4ea1-a01c-7c7333816cb8\") " Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780486 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780875 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780925 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780945 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a1de65e8-9721-4039-8c67-9fbb0d715693-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780958 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g447g\" (UniqueName: \"kubernetes.io/projected/a1de65e8-9721-4039-8c67-9fbb0d715693-kube-api-access-g447g\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.780972 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.783561 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5" (OuterVolumeSpecName: "kube-api-access-xckl5") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "kube-api-access-xckl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.788239 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts" (OuterVolumeSpecName: "scripts") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.812145 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.844231 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.866677 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data" (OuterVolumeSpecName: "config-data") pod "be4da4cf-3be6-4ea1-a01c-7c7333816cb8" (UID: "be4da4cf-3be6-4ea1-a01c-7c7333816cb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884119 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884150 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884159 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884167 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884175 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xckl5\" (UniqueName: \"kubernetes.io/projected/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-kube-api-access-xckl5\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:36 crc kubenswrapper[4822]: I1010 06:43:36.884185 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be4da4cf-3be6-4ea1-a01c-7c7333816cb8-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.214675 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-plp78" event={"ID":"a1de65e8-9721-4039-8c67-9fbb0d715693","Type":"ContainerDied","Data":"3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2"} Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 
06:43:37.214725 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-plp78" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.214740 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3de9a178f94bf8bb14804ad873b10ef33bb24938587b177d64e602f3e53ec6b2" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.217232 4822 generic.go:334] "Generic (PLEG): container finished" podID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerID="6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132" exitCode=0 Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.217473 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.218431 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerDied","Data":"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132"} Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.218464 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be4da4cf-3be6-4ea1-a01c-7c7333816cb8","Type":"ContainerDied","Data":"9df45187618bde4d7d9a1b5829bd2f6dd2e19bedc710f4861c99cc55a81b08b6"} Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.218488 4822 scope.go:117] "RemoveContainer" containerID="a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.251577 4822 scope.go:117] "RemoveContainer" containerID="cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.288499 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.290147 4822 scope.go:117] "RemoveContainer" 
containerID="6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.300323 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.316001 4822 scope.go:117] "RemoveContainer" containerID="9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.326991 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.327667 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" containerName="barbican-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.327681 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" containerName="barbican-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.327700 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="proxy-httpd" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.327706 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="proxy-httpd" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.327718 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-central-agent" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.327726 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-central-agent" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.327741 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-notification-agent" Oct 10 06:43:37 crc 
kubenswrapper[4822]: I1010 06:43:37.327747 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-notification-agent" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.327774 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="sg-core" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.327779 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="sg-core" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.327987 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="proxy-httpd" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.328000 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-notification-agent" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.328013 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="sg-core" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.328026 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" containerName="barbican-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.328034 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" containerName="ceilometer-central-agent" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.329604 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.332119 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.332358 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.336271 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.378331 4822 scope.go:117] "RemoveContainer" containerID="a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.378706 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578\": container with ID starting with a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578 not found: ID does not exist" containerID="a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.378750 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578"} err="failed to get container status \"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578\": rpc error: code = NotFound desc = could not find container \"a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578\": container with ID starting with a8f1967509330ed200dbda724ff3ef478d0901ca573043134115176012b6d578 not found: ID does not exist" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.378776 4822 scope.go:117] "RemoveContainer" containerID="cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 
06:43:37.380553 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad\": container with ID starting with cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad not found: ID does not exist" containerID="cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.380626 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad"} err="failed to get container status \"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad\": rpc error: code = NotFound desc = could not find container \"cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad\": container with ID starting with cd69e6c1226e7fd757ad686c7b16e801bdf0e67331be7d7e5118cff03fcff1ad not found: ID does not exist" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.380681 4822 scope.go:117] "RemoveContainer" containerID="6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.381038 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132\": container with ID starting with 6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132 not found: ID does not exist" containerID="6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.381093 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132"} err="failed to get container status \"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132\": rpc 
error: code = NotFound desc = could not find container \"6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132\": container with ID starting with 6e533e8d348123fcd8a961ebd7a80b25a9f180c9ff5c3c4f37228c369d6d1132 not found: ID does not exist" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.381117 4822 scope.go:117] "RemoveContainer" containerID="9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e" Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.381442 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e\": container with ID starting with 9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e not found: ID does not exist" containerID="9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.381470 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e"} err="failed to get container status \"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e\": rpc error: code = NotFound desc = could not find container \"9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e\": container with ID starting with 9a54d268ef6c435f8d765880efcddb7d1a2ca206a72c97357f06024120c46a4e not found: ID does not exist" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.494792 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.494925 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.494948 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.494970 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mvk\" (UniqueName: \"kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.495010 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.495042 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.495073 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.527266 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.530849 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.542599 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-58bsc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.542925 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.546978 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.548349 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.548753 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.550438 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.576581 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.597735 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.597886 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.597914 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.597939 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mvk\" (UniqueName: \"kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.597979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.598010 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.598042 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.599514 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.600124 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.603088 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.607727 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.609128 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.613880 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.622665 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.622980 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.635197 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mvk\" (UniqueName: \"kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk\") pod \"ceilometer-0\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " pod="openstack/ceilometer-0" Oct 10 06:43:37 
crc kubenswrapper[4822]: I1010 06:43:37.646100 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.679729 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4da4cf-3be6-4ea1-a01c-7c7333816cb8" path="/var/lib/kubelet/pods/be4da4cf-3be6-4ea1-a01c-7c7333816cb8/volumes" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.696997 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:37 crc kubenswrapper[4822]: E1010 06:43:37.700412 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3156ed-785b-492d-923f-cbd97a996b43" containerName="neutron-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.700442 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3156ed-785b-492d-923f-cbd97a996b43" containerName="neutron-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.700643 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3156ed-785b-492d-923f-cbd97a996b43" containerName="neutron-db-sync" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.704844 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config\") pod \"3d3156ed-785b-492d-923f-cbd97a996b43\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.704927 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb7w2\" (UniqueName: \"kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2\") pod \"3d3156ed-785b-492d-923f-cbd97a996b43\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.704966 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle\") pod \"3d3156ed-785b-492d-923f-cbd97a996b43\" (UID: \"3d3156ed-785b-492d-923f-cbd97a996b43\") " Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705221 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705253 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhbs\" (UniqueName: \"kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705276 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705294 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc 
kubenswrapper[4822]: I1010 06:43:37.705311 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705330 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mfn\" (UniqueName: \"kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705365 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705384 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705415 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: 
\"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.705454 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.708523 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.716184 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2" (OuterVolumeSpecName: "kube-api-access-lb7w2") pod "3d3156ed-785b-492d-923f-cbd97a996b43" (UID: "3d3156ed-785b-492d-923f-cbd97a996b43"). InnerVolumeSpecName "kube-api-access-lb7w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.725584 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.765690 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.765751 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config" (OuterVolumeSpecName: "config") pod "3d3156ed-785b-492d-923f-cbd97a996b43" (UID: "3d3156ed-785b-492d-923f-cbd97a996b43"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.779445 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.787606 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.802293 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810499 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhbs\" (UniqueName: \"kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810540 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810564 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810622 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810642 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mfn\" (UniqueName: \"kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810674 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810702 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810727 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810744 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810771 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlfw6\" (UniqueName: \"kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810792 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810836 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810888 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.810929 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.811135 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.811153 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb7w2\" (UniqueName: \"kubernetes.io/projected/3d3156ed-785b-492d-923f-cbd97a996b43-kube-api-access-lb7w2\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.814027 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" 
Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.816966 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.820399 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.823190 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.823407 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.825097 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3156ed-785b-492d-923f-cbd97a996b43" (UID: "3d3156ed-785b-492d-923f-cbd97a996b43"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.832095 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.834015 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.834294 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhbs\" (UniqueName: \"kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.837016 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mfn\" (UniqueName: \"kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn\") pod \"barbican-worker-6b5444654f-5wp86\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.837848 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data\") pod \"barbican-keystone-listener-766cf74578-rdxjc\" (UID: 
\"3d602476-cde4-435f-93bc-a72c137d1b58\") " pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.881576 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912827 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912888 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912927 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912951 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vlfw6\" (UniqueName: \"kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.912994 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913020 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913054 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfl5\" (UniqueName: \"kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913088 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913104 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913134 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.913186 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3156ed-785b-492d-923f-cbd97a996b43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.914204 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.915103 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.915197 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: 
\"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.915550 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.924713 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.935476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlfw6\" (UniqueName: \"kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6\") pod \"dnsmasq-dns-586bdc5f9-pfxst\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:37 crc kubenswrapper[4822]: I1010 06:43:37.935943 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.015185 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.015248 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.015307 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.015399 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.015421 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfl5\" (UniqueName: \"kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " 
pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.016343 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.020097 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.021376 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.029152 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.037914 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfl5\" (UniqueName: \"kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5\") pod \"barbican-api-6f84f988dd-n6ss7\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: 
I1010 06:43:38.206018 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.234908 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.239044 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqsbv" event={"ID":"3d3156ed-785b-492d-923f-cbd97a996b43","Type":"ContainerDied","Data":"472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2"} Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.239077 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472347ca63ec3b19aad8df0899d083de054f55e7f7d4bae996c9b107551cbbe2" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.239167 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qqsbv" Oct 10 06:43:38 crc kubenswrapper[4822]: W1010 06:43:38.267419 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88d23dde_d004_4a04_983e_bd72574a8c0d.slice/crio-626a0ad15c171f247406374afa0f611c53b72fe529dfc7eded50bf4d0d5814f4 WatchSource:0}: Error finding container 626a0ad15c171f247406374afa0f611c53b72fe529dfc7eded50bf4d0d5814f4: Status 404 returned error can't find the container with id 626a0ad15c171f247406374afa0f611c53b72fe529dfc7eded50bf4d0d5814f4 Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.272940 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.416620 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.431951 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:43:38 crc kubenswrapper[4822]: W1010 06:43:38.433233 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11c95228_48ad_4e25_9cf7_bf0a2a1e4c69.slice/crio-283e16ed7fc3f8e422839e721a64fe525ce7ce214a51cdf500c1ed137b76e879 WatchSource:0}: Error finding container 283e16ed7fc3f8e422839e721a64fe525ce7ce214a51cdf500c1ed137b76e879: Status 404 returned error can't find the container with id 283e16ed7fc3f8e422839e721a64fe525ce7ce214a51cdf500c1ed137b76e879 Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.513164 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.542182 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.545205 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.561064 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.580031 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:38 crc kubenswrapper[4822]: W1010 06:43:38.595817 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode364a315_3983_4e5c_8dae_20c45b6261f6.slice/crio-c1262dc6fb402568c4b3cf6697a5ced20f7434301ff95612eb829ea4d7432b42 WatchSource:0}: Error finding container c1262dc6fb402568c4b3cf6697a5ced20f7434301ff95612eb829ea4d7432b42: Status 404 returned error can't find the container with id c1262dc6fb402568c4b3cf6697a5ced20f7434301ff95612eb829ea4d7432b42 Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.728858 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.730762 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.732821 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.732956 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733096 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbrm2\" (UniqueName: \"kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733178 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733194 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8snl8" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733327 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733583 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733672 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.733837 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.752007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.834992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835101 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbrm2\" (UniqueName: 
\"kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835129 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835178 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835276 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835322 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.835374 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnzj\" (UniqueName: 
\"kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.837037 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.837414 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.838898 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.838985 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.839181 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" 
(UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.839221 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.839287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.840104 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.843062 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.853967 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbrm2\" (UniqueName: \"kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2\") pod \"dnsmasq-dns-85ff748b95-vqkh9\" (UID: 
\"daf88505-dfad-4284-b11d-317a10774ad5\") " pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.870184 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.872788 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:38 crc kubenswrapper[4822]: W1010 06:43:38.901696 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b206732_dd22_4cb9_a322_ef8ea8021341.slice/crio-6bda34b90f45e883198933582f0f051164d6966ba3c88d7af317e572dbd9e67a WatchSource:0}: Error finding container 6bda34b90f45e883198933582f0f051164d6966ba3c88d7af317e572dbd9e67a: Status 404 returned error can't find the container with id 6bda34b90f45e883198933582f0f051164d6966ba3c88d7af317e572dbd9e67a Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.940974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.941350 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.941467 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnzj\" (UniqueName: 
\"kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.941513 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.941549 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.944782 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.945625 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.951659 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " 
pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.954569 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:38 crc kubenswrapper[4822]: I1010 06:43:38.963388 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnzj\" (UniqueName: \"kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj\") pod \"neutron-7d4866c7b-kfb67\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.080757 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.284990 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerStarted","Data":"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.285240 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerStarted","Data":"6bda34b90f45e883198933582f0f051164d6966ba3c88d7af317e572dbd9e67a"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.305953 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerStarted","Data":"283e16ed7fc3f8e422839e721a64fe525ce7ce214a51cdf500c1ed137b76e879"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.318676 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerStarted","Data":"20f51c5a34c2dca09531387d9af017473fa5f79978b90eb4eb6296f3d3cdd1ab"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.320264 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" event={"ID":"e364a315-3983-4e5c-8dae-20c45b6261f6","Type":"ContainerStarted","Data":"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.320291 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" event={"ID":"e364a315-3983-4e5c-8dae-20c45b6261f6","Type":"ContainerStarted","Data":"c1262dc6fb402568c4b3cf6697a5ced20f7434301ff95612eb829ea4d7432b42"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.320404 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" podUID="e364a315-3983-4e5c-8dae-20c45b6261f6" containerName="init" containerID="cri-o://bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8" gracePeriod=10 Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.322381 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerStarted","Data":"626a0ad15c171f247406374afa0f611c53b72fe529dfc7eded50bf4d0d5814f4"} Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.505174 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.733944 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.816816 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.863620 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.863865 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.863987 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.864160 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.864307 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc 
kubenswrapper[4822]: I1010 06:43:39.864409 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlfw6\" (UniqueName: \"kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6\") pod \"e364a315-3983-4e5c-8dae-20c45b6261f6\" (UID: \"e364a315-3983-4e5c-8dae-20c45b6261f6\") " Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.879102 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6" (OuterVolumeSpecName: "kube-api-access-vlfw6") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "kube-api-access-vlfw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.930627 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.931578 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.946689 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.959457 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config" (OuterVolumeSpecName: "config") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.972871 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.972899 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.972910 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlfw6\" (UniqueName: \"kubernetes.io/projected/e364a315-3983-4e5c-8dae-20c45b6261f6-kube-api-access-vlfw6\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.972919 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:39 
crc kubenswrapper[4822]: I1010 06:43:39.972927 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:39 crc kubenswrapper[4822]: I1010 06:43:39.978242 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e364a315-3983-4e5c-8dae-20c45b6261f6" (UID: "e364a315-3983-4e5c-8dae-20c45b6261f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.074736 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e364a315-3983-4e5c-8dae-20c45b6261f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.341461 4822 generic.go:334] "Generic (PLEG): container finished" podID="e364a315-3983-4e5c-8dae-20c45b6261f6" containerID="bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8" exitCode=0 Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.341901 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" event={"ID":"e364a315-3983-4e5c-8dae-20c45b6261f6","Type":"ContainerDied","Data":"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.341934 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" event={"ID":"e364a315-3983-4e5c-8dae-20c45b6261f6","Type":"ContainerDied","Data":"c1262dc6fb402568c4b3cf6697a5ced20f7434301ff95612eb829ea4d7432b42"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.341950 4822 scope.go:117] "RemoveContainer" 
containerID="bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.342067 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-pfxst" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.355723 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerStarted","Data":"504c140e8ee512d9372cfd694641132689ce7cd1e99b2501bbde28cf540967f9"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.355770 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerStarted","Data":"43936f03c357f9c1d154f2aef74025c5597de4588e7e6b291fa4ba2ff5547c84"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.360685 4822 generic.go:334] "Generic (PLEG): container finished" podID="daf88505-dfad-4284-b11d-317a10774ad5" containerID="8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154" exitCode=0 Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.360762 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" event={"ID":"daf88505-dfad-4284-b11d-317a10774ad5","Type":"ContainerDied","Data":"8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.360787 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" event={"ID":"daf88505-dfad-4284-b11d-317a10774ad5","Type":"ContainerStarted","Data":"eee06d1ad7e3048bbf610945043c83b1ecc02561883d7d7ef71ffb0641baa2c6"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.371296 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" 
event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerStarted","Data":"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.372216 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.372244 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.390083 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerStarted","Data":"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.391300 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerStarted","Data":"fff7cf015b33f2a5880a93e456e248883822a80932a784f4d93f331c588a08ae"} Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.406854 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.417652 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-pfxst"] Oct 10 06:43:40 crc kubenswrapper[4822]: I1010 06:43:40.422670 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f84f988dd-n6ss7" podStartSLOduration=3.422650761 podStartE2EDuration="3.422650761s" podCreationTimestamp="2025-10-10 06:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:40.410692014 +0000 UTC m=+1167.505850210" watchObservedRunningTime="2025-10-10 06:43:40.422650761 
+0000 UTC m=+1167.517808957" Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.091128 4822 scope.go:117] "RemoveContainer" containerID="bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8" Oct 10 06:43:41 crc kubenswrapper[4822]: E1010 06:43:41.092397 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8\": container with ID starting with bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8 not found: ID does not exist" containerID="bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8" Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.092487 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8"} err="failed to get container status \"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8\": rpc error: code = NotFound desc = could not find container \"bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8\": container with ID starting with bc4153fe6b96a956ea84193ccc2c85a9227f83ee1f28281fb7ee3ddaf6c863b8 not found: ID does not exist" Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.400293 4822 generic.go:334] "Generic (PLEG): container finished" podID="41b14558-f019-4f51-a3ab-b5689de6336a" containerID="83166d2b466bbc334a7a76449841bb9af7bb6da59d0f938b0df0671307d8888b" exitCode=0 Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.400399 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t5dtz" event={"ID":"41b14558-f019-4f51-a3ab-b5689de6336a","Type":"ContainerDied","Data":"83166d2b466bbc334a7a76449841bb9af7bb6da59d0f938b0df0671307d8888b"} Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.403891 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" 
event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerStarted","Data":"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed"} Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.404910 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.435155 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d4866c7b-kfb67" podStartSLOduration=3.435140053 podStartE2EDuration="3.435140053s" podCreationTimestamp="2025-10-10 06:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:41.43334294 +0000 UTC m=+1168.528501136" watchObservedRunningTime="2025-10-10 06:43:41.435140053 +0000 UTC m=+1168.530298249" Oct 10 06:43:41 crc kubenswrapper[4822]: I1010 06:43:41.664773 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e364a315-3983-4e5c-8dae-20c45b6261f6" path="/var/lib/kubelet/pods/e364a315-3983-4e5c-8dae-20c45b6261f6/volumes" Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.427771 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerStarted","Data":"f8baf0842acd07291e0bfa649f81cdf8a3eb732ce2a9b364c315a6052e23c0e2"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.429790 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" event={"ID":"daf88505-dfad-4284-b11d-317a10774ad5","Type":"ContainerStarted","Data":"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.429917 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.437692 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerStarted","Data":"fda58ea5f9b88b81e2eae62a1675670199fddb3e9d024a70d2a8d75abe7fbe9f"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.437732 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerStarted","Data":"a1174022bfa90fdbbc7bdb6448ded60ea8ea9a3effbb5e0c5631a9b1f7bfe1e1"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.439814 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerStarted","Data":"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.439855 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerStarted","Data":"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a"} Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.476086 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" podStartSLOduration=4.476070908 podStartE2EDuration="4.476070908s" podCreationTimestamp="2025-10-10 06:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:42.469106214 +0000 UTC m=+1169.564264410" watchObservedRunningTime="2025-10-10 06:43:42.476070908 +0000 UTC m=+1169.571229104" Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.498030 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b5444654f-5wp86" 
podStartSLOduration=2.799134224 podStartE2EDuration="5.498011001s" podCreationTimestamp="2025-10-10 06:43:37 +0000 UTC" firstStartedPulling="2025-10-10 06:43:38.441885899 +0000 UTC m=+1165.537044095" lastFinishedPulling="2025-10-10 06:43:41.140762676 +0000 UTC m=+1168.235920872" observedRunningTime="2025-10-10 06:43:42.495014023 +0000 UTC m=+1169.590172219" watchObservedRunningTime="2025-10-10 06:43:42.498011001 +0000 UTC m=+1169.593169197" Oct 10 06:43:42 crc kubenswrapper[4822]: I1010 06:43:42.524930 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" podStartSLOduration=2.830283816 podStartE2EDuration="5.52491184s" podCreationTimestamp="2025-10-10 06:43:37 +0000 UTC" firstStartedPulling="2025-10-10 06:43:38.446839033 +0000 UTC m=+1165.541997229" lastFinishedPulling="2025-10-10 06:43:41.141467057 +0000 UTC m=+1168.236625253" observedRunningTime="2025-10-10 06:43:42.52458085 +0000 UTC m=+1169.619739046" watchObservedRunningTime="2025-10-10 06:43:42.52491184 +0000 UTC m=+1169.620070036" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.046757 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159542 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159640 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmngr\" (UniqueName: \"kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159732 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159782 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159938 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data\") pod \"41b14558-f019-4f51-a3ab-b5689de6336a\" (UID: \"41b14558-f019-4f51-a3ab-b5689de6336a\") " Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.159935 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.160410 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41b14558-f019-4f51-a3ab-b5689de6336a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.167355 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.167432 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr" (OuterVolumeSpecName: "kube-api-access-wmngr") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "kube-api-access-wmngr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.174146 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts" (OuterVolumeSpecName: "scripts") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.200407 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.232901 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data" (OuterVolumeSpecName: "config-data") pod "41b14558-f019-4f51-a3ab-b5689de6336a" (UID: "41b14558-f019-4f51-a3ab-b5689de6336a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.261877 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.261915 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.261925 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.261934 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b14558-f019-4f51-a3ab-b5689de6336a-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.261943 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmngr\" (UniqueName: \"kubernetes.io/projected/41b14558-f019-4f51-a3ab-b5689de6336a-kube-api-access-wmngr\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.325645 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"] Oct 10 06:43:43 crc kubenswrapper[4822]: E1010 06:43:43.326021 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" containerName="cinder-db-sync" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.326038 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" containerName="cinder-db-sync" Oct 10 06:43:43 crc kubenswrapper[4822]: E1010 
06:43:43.326054 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364a315-3983-4e5c-8dae-20c45b6261f6" containerName="init" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.326060 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364a315-3983-4e5c-8dae-20c45b6261f6" containerName="init" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.326237 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" containerName="cinder-db-sync" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.326250 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e364a315-3983-4e5c-8dae-20c45b6261f6" containerName="init" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.327137 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.339788 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.340998 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.341478 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.452041 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerStarted","Data":"08edf3d006337c1295bc66bb92a7f005b124ccaf1a0099173d6796735225fee7"} Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.453046 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.456713 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-t5dtz" event={"ID":"41b14558-f019-4f51-a3ab-b5689de6336a","Type":"ContainerDied","Data":"3becba294ebd3a5d27f9b5f101e2b78bd5911f0762ae9c88b947612f4d20a9c9"} Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.456766 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3becba294ebd3a5d27f9b5f101e2b78bd5911f0762ae9c88b947612f4d20a9c9" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.456984 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t5dtz" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.464794 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.464960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.465074 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.465128 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.465227 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.465300 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb2n\" (UniqueName: \"kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.465342 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.482082 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.959335378 podStartE2EDuration="6.4820576s" podCreationTimestamp="2025-10-10 06:43:37 +0000 UTC" firstStartedPulling="2025-10-10 06:43:38.288257701 +0000 UTC m=+1165.383415897" lastFinishedPulling="2025-10-10 06:43:42.810979923 +0000 UTC m=+1169.906138119" observedRunningTime="2025-10-10 06:43:43.480815334 +0000 UTC m=+1170.575973530" watchObservedRunningTime="2025-10-10 06:43:43.4820576 
+0000 UTC m=+1170.577215816" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.567117 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.568658 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.569151 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.569197 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.569337 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.569469 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mb2n\" (UniqueName: \"kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.569503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.573446 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.575466 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.575545 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.575880 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.576180 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.576438 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.600394 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mb2n\" (UniqueName: \"kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n\") pod \"neutron-6c7474d4d9-hl56q\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.652733 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.680440 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.681914 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.693451 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mjpm2" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.693817 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.693929 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.694074 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.723861 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773399 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773511 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773533 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsq78\" (UniqueName: \"kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78\") pod \"cinder-scheduler-0\" (UID: 
\"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773556 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773616 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.773644 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.780736 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.810305 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.816999 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.819246 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878143 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878323 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878433 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878460 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsq78\" (UniqueName: \"kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78\") pod \"cinder-scheduler-0\" (UID: 
\"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.878501 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.883735 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.887011 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.916863 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.917400 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.921246 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.944782 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsq78\" (UniqueName: \"kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78\") pod \"cinder-scheduler-0\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.957971 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.974173 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.990242 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.997507 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqmwb\" (UniqueName: \"kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.999782 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:43 crc kubenswrapper[4822]: I1010 06:43:43.999878 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.000006 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.000084 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.000126 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.012042 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.026338 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106657 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106697 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106723 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106759 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106781 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqmwb\" (UniqueName: 
\"kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106820 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49d4g\" (UniqueName: \"kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106838 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106864 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106916 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106940 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config\") pod 
\"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.106985 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.107007 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.108230 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.108884 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.109696 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.110219 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.110744 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.162859 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqmwb\" (UniqueName: \"kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb\") pod \"dnsmasq-dns-5c9776ccc5-9lj64\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.210832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.210879 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.210917 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.211006 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.211059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.211117 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49d4g\" (UniqueName: \"kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.211237 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data\") pod \"cinder-api-0\" (UID: 
\"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.211627 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.212051 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.219900 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.222378 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.222551 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.222675 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.230402 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49d4g\" (UniqueName: \"kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g\") pod \"cinder-api-0\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.436885 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.451250 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.473393 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="dnsmasq-dns" containerID="cri-o://a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032" gracePeriod=10 Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.579243 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:44 crc kubenswrapper[4822]: I1010 06:43:44.653794 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"] Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.109212 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.288735 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.304459 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.351955 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.352142 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.352303 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.352428 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.352749 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbrm2\" (UniqueName: \"kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.353137 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb\") pod \"daf88505-dfad-4284-b11d-317a10774ad5\" (UID: \"daf88505-dfad-4284-b11d-317a10774ad5\") " Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.363094 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2" (OuterVolumeSpecName: "kube-api-access-fbrm2") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). InnerVolumeSpecName "kube-api-access-fbrm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.421469 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.457497 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbrm2\" (UniqueName: \"kubernetes.io/projected/daf88505-dfad-4284-b11d-317a10774ad5-kube-api-access-fbrm2\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.457528 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.460940 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.464610 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.465074 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.488914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerStarted","Data":"1dd0282c32b952bc28426c99bc939d6cd5556dd9df6f57d786cbccb734a3f9db"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.488961 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerStarted","Data":"9746516ae03a9d0431e081db90000cdd9ba8b028ceb5e12c38a4ccdc2127be36"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.491143 4822 generic.go:334] "Generic (PLEG): container finished" podID="daf88505-dfad-4284-b11d-317a10774ad5" containerID="a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032" exitCode=0 Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.491244 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" event={"ID":"daf88505-dfad-4284-b11d-317a10774ad5","Type":"ContainerDied","Data":"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.491278 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" event={"ID":"daf88505-dfad-4284-b11d-317a10774ad5","Type":"ContainerDied","Data":"eee06d1ad7e3048bbf610945043c83b1ecc02561883d7d7ef71ffb0641baa2c6"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.491300 4822 scope.go:117] "RemoveContainer" containerID="a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.491502 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vqkh9" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.498129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerStarted","Data":"6ac786dce6b7d1d4434f21eb5796f23dcb6b2082c45391d34e02c0e06324ba4f"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.509227 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" event={"ID":"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5","Type":"ContainerStarted","Data":"a60bf2a6a25865761ebf409822e2df24282448121ae7ffeabe9b4e4bef3943e3"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.511263 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerStarted","Data":"74e51d7317976538fcdd12c8916a36969be8dd717bde28005af22625dfdf21a8"} Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.519624 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config" (OuterVolumeSpecName: "config") pod "daf88505-dfad-4284-b11d-317a10774ad5" (UID: "daf88505-dfad-4284-b11d-317a10774ad5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.525955 4822 scope.go:117] "RemoveContainer" containerID="8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.559311 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.559340 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.559350 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.559358 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daf88505-dfad-4284-b11d-317a10774ad5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.601292 4822 scope.go:117] "RemoveContainer" containerID="a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032" Oct 10 06:43:45 crc kubenswrapper[4822]: E1010 06:43:45.603948 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032\": container with ID starting with 
a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032 not found: ID does not exist" containerID="a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032"
Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.604007 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032"} err="failed to get container status \"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032\": rpc error: code = NotFound desc = could not find container \"a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032\": container with ID starting with a4c4c83df2b892b6084c46be2e8b1eb4bc46e649e30c9827461603ed6a0bd032 not found: ID does not exist"
Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.604037 4822 scope.go:117] "RemoveContainer" containerID="8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154"
Oct 10 06:43:45 crc kubenswrapper[4822]: E1010 06:43:45.605667 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154\": container with ID starting with 8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154 not found: ID does not exist" containerID="8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154"
Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.605698 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154"} err="failed to get container status \"8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154\": rpc error: code = NotFound desc = could not find container \"8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154\": container with ID starting with 8f52af088a0e22c4df84bd3ffc8b9d0792e1c605f81a79e0916420217550c154 not found: ID does not
exist" Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.830294 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:45 crc kubenswrapper[4822]: I1010 06:43:45.856900 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vqkh9"] Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.521258 4822 generic.go:334] "Generic (PLEG): container finished" podID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerID="8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28" exitCode=0 Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.521521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" event={"ID":"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5","Type":"ContainerDied","Data":"8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28"} Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.525107 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerStarted","Data":"efac4765292da7b3dd9b11b4de9112dd774cbf45ca52533b0cf61b23a4974bd5"} Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.527306 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerStarted","Data":"a63c8d6a853e7bf676f1d100b90459d644b828358cb3a287d4e0e0565d941778"} Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.527960 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.546193 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerStarted","Data":"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44"} Oct 10 
06:43:46 crc kubenswrapper[4822]: I1010 06:43:46.572457 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c7474d4d9-hl56q" podStartSLOduration=3.572410815 podStartE2EDuration="3.572410815s" podCreationTimestamp="2025-10-10 06:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:46.558496908 +0000 UTC m=+1173.653655114" watchObservedRunningTime="2025-10-10 06:43:46.572410815 +0000 UTC m=+1173.667569021"
Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.556171 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" event={"ID":"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5","Type":"ContainerStarted","Data":"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b"}
Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.556559 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64"
Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.559324 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerStarted","Data":"46b00e1f9a3f1b9f8f88500e87a20f6e02773078b628ee962768683f5258b98a"}
Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.561064 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerStarted","Data":"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb"}
Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.581464 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" podStartSLOduration=4.581447687 podStartE2EDuration="4.581447687s" podCreationTimestamp="2025-10-10 06:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:47.576341707 +0000 UTC m=+1174.671499913" watchObservedRunningTime="2025-10-10 06:43:47.581447687 +0000 UTC m=+1174.676605883" Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.595697 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.751687591 podStartE2EDuration="4.595680104s" podCreationTimestamp="2025-10-10 06:43:43 +0000 UTC" firstStartedPulling="2025-10-10 06:43:44.630458835 +0000 UTC m=+1171.725617031" lastFinishedPulling="2025-10-10 06:43:45.474451348 +0000 UTC m=+1172.569609544" observedRunningTime="2025-10-10 06:43:47.593437388 +0000 UTC m=+1174.688595604" watchObservedRunningTime="2025-10-10 06:43:47.595680104 +0000 UTC m=+1174.690838310" Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.617789 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.617774261 podStartE2EDuration="4.617774261s" podCreationTimestamp="2025-10-10 06:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:47.614694291 +0000 UTC m=+1174.709852487" watchObservedRunningTime="2025-10-10 06:43:47.617774261 +0000 UTC m=+1174.712932457" Oct 10 06:43:47 crc kubenswrapper[4822]: I1010 06:43:47.667871 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf88505-dfad-4284-b11d-317a10774ad5" path="/var/lib/kubelet/pods/daf88505-dfad-4284-b11d-317a10774ad5/volumes" Oct 10 06:43:48 crc kubenswrapper[4822]: I1010 06:43:48.437518 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:48 crc kubenswrapper[4822]: I1010 06:43:48.570133 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 06:43:49 crc kubenswrapper[4822]: 
I1010 06:43:49.013722 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.053140 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:43:49 crc kubenswrapper[4822]: E1010 06:43:49.053511 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="init" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.053530 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="init" Oct 10 06:43:49 crc kubenswrapper[4822]: E1010 06:43:49.053560 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="dnsmasq-dns" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.053566 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="dnsmasq-dns" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.053750 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf88505-dfad-4284-b11d-317a10774ad5" containerName="dnsmasq-dns" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.054771 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.057927 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.058137 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.072178 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148168 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148307 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148364 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrv64\" (UniqueName: \"kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148402 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148437 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148640 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.148702 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.250727 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.250866 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.250902 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrv64\" (UniqueName: \"kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.250925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.250955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.251013 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.251037 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.252181 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.260452 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.262225 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.263025 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.274846 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data\") pod 
\"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.276456 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrv64\" (UniqueName: \"kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.276596 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle\") pod \"barbican-api-5c8fcfdd4-4gk9s\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.377445 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.592215 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api-log" containerID="cri-o://d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" gracePeriod=30 Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.592910 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api" containerID="cri-o://d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" gracePeriod=30 Oct 10 06:43:49 crc kubenswrapper[4822]: I1010 06:43:49.952185 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.208968 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.376791 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.376880 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.376921 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.376942 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.377025 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.377076 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49d4g\" (UniqueName: \"kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.377280 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.377399 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts\") pod \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\" (UID: \"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7\") " Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.377898 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.380488 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs" (OuterVolumeSpecName: "logs") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: 
"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.381423 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts" (OuterVolumeSpecName: "scripts") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.383917 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.386076 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g" (OuterVolumeSpecName: "kube-api-access-49d4g") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "kube-api-access-49d4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.419973 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.479884 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.479919 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.479929 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49d4g\" (UniqueName: \"kubernetes.io/projected/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-kube-api-access-49d4g\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.479940 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.479948 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.520964 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data" (OuterVolumeSpecName: "config-data") pod "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" (UID: "2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.592289 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.650852 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerStarted","Data":"489b35ea36a581d0ca50df223a61ef6883583aa6d15472a109b5513aa80e1f47"} Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.651211 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerStarted","Data":"d4a9bafe329a5dd29564e56dc6c12b0bb627050c8e4b71bc14df993ef884ad8a"} Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673092 4822 generic.go:334] "Generic (PLEG): container finished" podID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerID="d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" exitCode=0 Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673322 4822 generic.go:334] "Generic (PLEG): container finished" podID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerID="d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" exitCode=143 Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673400 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerDied","Data":"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb"} Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673472 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerDied","Data":"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44"} Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673535 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7","Type":"ContainerDied","Data":"6ac786dce6b7d1d4434f21eb5796f23dcb6b2082c45391d34e02c0e06324ba4f"} Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673606 4822 scope.go:117] "RemoveContainer" containerID="d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.673829 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.717279 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.724993 4822 scope.go:117] "RemoveContainer" containerID="d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.744370 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.760719 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:50 crc kubenswrapper[4822]: E1010 06:43:50.761286 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.761361 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api" Oct 10 06:43:50 crc kubenswrapper[4822]: E1010 06:43:50.761640 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" 
containerName="cinder-api-log" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.761723 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api-log" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.761987 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api-log" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.762081 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" containerName="cinder-api" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.769982 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.771465 4822 scope.go:117] "RemoveContainer" containerID="d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" Oct 10 06:43:50 crc kubenswrapper[4822]: E1010 06:43:50.773399 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb\": container with ID starting with d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb not found: ID does not exist" containerID="d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.773534 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb"} err="failed to get container status \"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb\": rpc error: code = NotFound desc = could not find container \"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb\": container with ID starting with d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb 
not found: ID does not exist" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.773621 4822 scope.go:117] "RemoveContainer" containerID="d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" Oct 10 06:43:50 crc kubenswrapper[4822]: E1010 06:43:50.774238 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44\": container with ID starting with d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44 not found: ID does not exist" containerID="d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.774342 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44"} err="failed to get container status \"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44\": rpc error: code = NotFound desc = could not find container \"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44\": container with ID starting with d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44 not found: ID does not exist" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.774407 4822 scope.go:117] "RemoveContainer" containerID="d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.774731 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb"} err="failed to get container status \"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb\": rpc error: code = NotFound desc = could not find container \"d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb\": container with ID starting with 
d2095fe51d83808e89a771b43fd5c63d3c2d5bca3602f04a5cfce13864dc1eeb not found: ID does not exist" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.774933 4822 scope.go:117] "RemoveContainer" containerID="d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.775187 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44"} err="failed to get container status \"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44\": rpc error: code = NotFound desc = could not find container \"d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44\": container with ID starting with d6bf13bb577aeb1edf885d9657779699e8eb5b8632638b90221c49e0b5786a44 not found: ID does not exist" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.780312 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.785401 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.790264 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.795933 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.883095 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899020 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899059 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899079 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899246 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899488 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899547 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: 
I1010 06:43:50.899613 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899642 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44f6n\" (UniqueName: \"kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.899721 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:50 crc kubenswrapper[4822]: I1010 06:43:50.930879 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001475 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001512 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 
06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001529 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001548 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001607 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001628 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001642 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44f6n\" (UniqueName: 
\"kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.001672 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.002349 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.002896 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.007577 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.009813 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.016508 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.017089 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.017405 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.019207 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.019285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44f6n\" (UniqueName: \"kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n\") pod \"cinder-api-0\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.089276 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.665536 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7" path="/var/lib/kubelet/pods/2f1ed51b-1b80-40ed-ac4a-666ecfbb63e7/volumes" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.667007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:43:51 crc kubenswrapper[4822]: W1010 06:43:51.683292 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0da7840_eaa9_46a7_bda6_5de928993572.slice/crio-680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b WatchSource:0}: Error finding container 680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b: Status 404 returned error can't find the container with id 680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.689432 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerStarted","Data":"d17675142c2329604a137158115089b304e6ce04d8ec7938439c69f02db6cd14"} Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.689751 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:51 crc kubenswrapper[4822]: I1010 06:43:51.717285 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" podStartSLOduration=2.717263841 podStartE2EDuration="2.717263841s" podCreationTimestamp="2025-10-10 06:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:51.709253946 +0000 UTC m=+1178.804412162" 
watchObservedRunningTime="2025-10-10 06:43:51.717263841 +0000 UTC m=+1178.812422047" Oct 10 06:43:52 crc kubenswrapper[4822]: I1010 06:43:52.719068 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerStarted","Data":"f83cde6f4a70de9b355f7b282554b988566370270eca0205f68d4daaaf187345"} Oct 10 06:43:52 crc kubenswrapper[4822]: I1010 06:43:52.719490 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:43:52 crc kubenswrapper[4822]: I1010 06:43:52.719514 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerStarted","Data":"680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b"} Oct 10 06:43:53 crc kubenswrapper[4822]: I1010 06:43:53.729221 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerStarted","Data":"5768417b026ddc15a2a8c2d6da91a0f1f8ec8b7c89708cc85cf21fe86db42db9"} Oct 10 06:43:53 crc kubenswrapper[4822]: I1010 06:43:53.763248 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.763224819 podStartE2EDuration="3.763224819s" podCreationTimestamp="2025-10-10 06:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:43:53.753617217 +0000 UTC m=+1180.848775423" watchObservedRunningTime="2025-10-10 06:43:53.763224819 +0000 UTC m=+1180.858383015" Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.214484 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.296439 4822 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.440028 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.502332 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.505123 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="dnsmasq-dns" containerID="cri-o://b6c57ce6bc8804ab9c4e266d92708e442dcf3712d835baf3e1ad4258627ebf92" gracePeriod=10 Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.747264 4822 generic.go:334] "Generic (PLEG): container finished" podID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerID="b6c57ce6bc8804ab9c4e266d92708e442dcf3712d835baf3e1ad4258627ebf92" exitCode=0 Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.747521 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="cinder-scheduler" containerID="cri-o://efac4765292da7b3dd9b11b4de9112dd774cbf45ca52533b0cf61b23a4974bd5" gracePeriod=30 Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.747633 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="probe" containerID="cri-o://46b00e1f9a3f1b9f8f88500e87a20f6e02773078b628ee962768683f5258b98a" gracePeriod=30 Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.747781 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" 
event={"ID":"56ec6334-97d6-4fa8-8f14-ce44ab82aa15","Type":"ContainerDied","Data":"b6c57ce6bc8804ab9c4e266d92708e442dcf3712d835baf3e1ad4258627ebf92"} Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.748075 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 06:43:54 crc kubenswrapper[4822]: I1010 06:43:54.976615 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106192 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106327 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106502 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7hvf\" (UniqueName: \"kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106539 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106579 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.106620 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb\") pod \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\" (UID: \"56ec6334-97d6-4fa8-8f14-ce44ab82aa15\") " Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.111588 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf" (OuterVolumeSpecName: "kube-api-access-q7hvf") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "kube-api-access-q7hvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.155156 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.157636 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.158773 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.160023 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config" (OuterVolumeSpecName: "config") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.160847 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56ec6334-97d6-4fa8-8f14-ce44ab82aa15" (UID: "56ec6334-97d6-4fa8-8f14-ce44ab82aa15"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208582 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7hvf\" (UniqueName: \"kubernetes.io/projected/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-kube-api-access-q7hvf\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208612 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208621 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208630 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208638 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.208645 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ec6334-97d6-4fa8-8f14-ce44ab82aa15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.759660 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.759665 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2twgn" event={"ID":"56ec6334-97d6-4fa8-8f14-ce44ab82aa15","Type":"ContainerDied","Data":"c98624a7c1c42d6180bc26e56111fbc01bf472396c68be1aa9951cd140730b1a"} Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.759729 4822 scope.go:117] "RemoveContainer" containerID="b6c57ce6bc8804ab9c4e266d92708e442dcf3712d835baf3e1ad4258627ebf92" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.763083 4822 generic.go:334] "Generic (PLEG): container finished" podID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerID="46b00e1f9a3f1b9f8f88500e87a20f6e02773078b628ee962768683f5258b98a" exitCode=0 Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.763164 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerDied","Data":"46b00e1f9a3f1b9f8f88500e87a20f6e02773078b628ee962768683f5258b98a"} Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.783855 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.786001 4822 scope.go:117] "RemoveContainer" containerID="49a62a065ff824c72d62c05f8af2fd8d6f353aec586bfd2f8c9eb655880e1638" Oct 10 06:43:55 crc kubenswrapper[4822]: I1010 06:43:55.791613 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2twgn"] Oct 10 06:43:57 crc kubenswrapper[4822]: I1010 06:43:57.660616 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" path="/var/lib/kubelet/pods/56ec6334-97d6-4fa8-8f14-ce44ab82aa15/volumes" Oct 10 06:43:57 crc kubenswrapper[4822]: I1010 06:43:57.809137 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerID="efac4765292da7b3dd9b11b4de9112dd774cbf45ca52533b0cf61b23a4974bd5" exitCode=0 Oct 10 06:43:57 crc kubenswrapper[4822]: I1010 06:43:57.809176 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerDied","Data":"efac4765292da7b3dd9b11b4de9112dd774cbf45ca52533b0cf61b23a4974bd5"} Oct 10 06:43:57 crc kubenswrapper[4822]: I1010 06:43:57.992376 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.161190 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.161855 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.162045 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.162227 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsq78\" (UniqueName: \"kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: 
\"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.162324 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.162372 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id\") pod \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\" (UID: \"782077ed-9c40-41b6-bdc5-29ab9c18d49a\") " Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.162445 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.163170 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/782077ed-9c40-41b6-bdc5-29ab9c18d49a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.168999 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.183901 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78" (OuterVolumeSpecName: "kube-api-access-qsq78") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "kube-api-access-qsq78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.187034 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts" (OuterVolumeSpecName: "scripts") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.250652 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.265060 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.265103 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsq78\" (UniqueName: \"kubernetes.io/projected/782077ed-9c40-41b6-bdc5-29ab9c18d49a-kube-api-access-qsq78\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.265118 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.265130 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.291647 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data" (OuterVolumeSpecName: "config-data") pod "782077ed-9c40-41b6-bdc5-29ab9c18d49a" (UID: "782077ed-9c40-41b6-bdc5-29ab9c18d49a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.366650 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782077ed-9c40-41b6-bdc5-29ab9c18d49a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.819414 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"782077ed-9c40-41b6-bdc5-29ab9c18d49a","Type":"ContainerDied","Data":"74e51d7317976538fcdd12c8916a36969be8dd717bde28005af22625dfdf21a8"} Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.819470 4822 scope.go:117] "RemoveContainer" containerID="46b00e1f9a3f1b9f8f88500e87a20f6e02773078b628ee962768683f5258b98a" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.820575 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.845410 4822 scope.go:117] "RemoveContainer" containerID="efac4765292da7b3dd9b11b4de9112dd774cbf45ca52533b0cf61b23a4974bd5" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.857596 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.921748 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.938671 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:58 crc kubenswrapper[4822]: E1010 06:43:58.939064 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="dnsmasq-dns" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939082 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="dnsmasq-dns" 
Oct 10 06:43:58 crc kubenswrapper[4822]: E1010 06:43:58.939114 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="probe" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939120 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="probe" Oct 10 06:43:58 crc kubenswrapper[4822]: E1010 06:43:58.939131 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="cinder-scheduler" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939137 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="cinder-scheduler" Oct 10 06:43:58 crc kubenswrapper[4822]: E1010 06:43:58.939162 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="init" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939168 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="init" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939331 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ec6334-97d6-4fa8-8f14-ce44ab82aa15" containerName="dnsmasq-dns" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939347 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="cinder-scheduler" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.939370 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" containerName="probe" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.940377 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.943647 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.948537 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.995596 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vksr\" (UniqueName: \"kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.995648 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.995668 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.995867 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.996023 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:58 crc kubenswrapper[4822]: I1010 06:43:58.996111 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.097644 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.097976 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.098030 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.098075 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.098371 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.098569 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.098661 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vksr\" (UniqueName: \"kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.102812 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.102930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " 
pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.103641 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.103895 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.116596 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vksr\" (UniqueName: \"kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr\") pod \"cinder-scheduler-0\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.256782 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.593010 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.645934 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.661866 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782077ed-9c40-41b6-bdc5-29ab9c18d49a" path="/var/lib/kubelet/pods/782077ed-9c40-41b6-bdc5-29ab9c18d49a/volumes" Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.745215 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:43:59 crc kubenswrapper[4822]: I1010 06:43:59.861476 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerStarted","Data":"9a144ed9a820acef1b717593380d295971f0f0cec69bbca5b011fa52fe058d12"} Oct 10 06:44:00 crc kubenswrapper[4822]: I1010 06:44:00.211207 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:44:00 crc kubenswrapper[4822]: I1010 06:44:00.885193 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerStarted","Data":"efce444dc287af620565305af2a58baca7c83295b3bd9c0cfb54af7d64975eef"} Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.193329 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.258064 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:44:01 crc 
kubenswrapper[4822]: I1010 06:44:01.315084 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.315415 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f84f988dd-n6ss7" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api-log" containerID="cri-o://383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085" gracePeriod=30 Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.315415 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f84f988dd-n6ss7" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api" containerID="cri-o://8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b" gracePeriod=30 Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.900441 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerStarted","Data":"6bab53c5a6f89332c0aa8004dfd4394eade34796f5bf17f77ba2d71af6be542f"} Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.904091 4822 generic.go:334] "Generic (PLEG): container finished" podID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerID="383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085" exitCode=143 Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.904749 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerDied","Data":"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085"} Oct 10 06:44:01 crc kubenswrapper[4822]: I1010 06:44:01.926328 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.9263066159999997 podStartE2EDuration="3.926306616s" 
podCreationTimestamp="2025-10-10 06:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:44:01.923981988 +0000 UTC m=+1189.019140194" watchObservedRunningTime="2025-10-10 06:44:01.926306616 +0000 UTC m=+1189.021464822" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.347754 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.816043 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.835938 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.838143 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.839817 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.840131 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.846542 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xssff" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.897128 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.897224 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q5z\" (UniqueName: \"kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.897284 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:03 crc kubenswrapper[4822]: I1010 06:44:03.897316 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.000576 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.000661 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q5z\" (UniqueName: \"kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.000696 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.001438 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.002292 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.006548 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.016090 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.026747 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q5z\" (UniqueName: \"kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z\") pod \"openstackclient\" (UID: 
\"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.159387 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.257990 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.485969 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f84f988dd-n6ss7" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:56530->10.217.0.158:9311: read: connection reset by peer" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.487217 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f84f988dd-n6ss7" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:56516->10.217.0.158:9311: read: connection reset by peer" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.662435 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.877292 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.935609 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data\") pod \"4b206732-dd22-4cb9-a322-ef8ea8021341\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.935720 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rfl5\" (UniqueName: \"kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5\") pod \"4b206732-dd22-4cb9-a322-ef8ea8021341\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.935849 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle\") pod \"4b206732-dd22-4cb9-a322-ef8ea8021341\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.935923 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom\") pod \"4b206732-dd22-4cb9-a322-ef8ea8021341\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.936009 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs\") pod \"4b206732-dd22-4cb9-a322-ef8ea8021341\" (UID: \"4b206732-dd22-4cb9-a322-ef8ea8021341\") " Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.937527 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs" (OuterVolumeSpecName: "logs") pod "4b206732-dd22-4cb9-a322-ef8ea8021341" (UID: "4b206732-dd22-4cb9-a322-ef8ea8021341"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.948559 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5" (OuterVolumeSpecName: "kube-api-access-5rfl5") pod "4b206732-dd22-4cb9-a322-ef8ea8021341" (UID: "4b206732-dd22-4cb9-a322-ef8ea8021341"). InnerVolumeSpecName "kube-api-access-5rfl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.951936 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b206732-dd22-4cb9-a322-ef8ea8021341" (UID: "4b206732-dd22-4cb9-a322-ef8ea8021341"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.955909 4822 generic.go:334] "Generic (PLEG): container finished" podID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerID="8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b" exitCode=0 Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.956104 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f84f988dd-n6ss7" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.956226 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerDied","Data":"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b"} Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.956351 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f84f988dd-n6ss7" event={"ID":"4b206732-dd22-4cb9-a322-ef8ea8021341","Type":"ContainerDied","Data":"6bda34b90f45e883198933582f0f051164d6966ba3c88d7af317e572dbd9e67a"} Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.956442 4822 scope.go:117] "RemoveContainer" containerID="8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.959469 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8cd3778-5a5a-483a-af22-5b8420ae896b","Type":"ContainerStarted","Data":"3e9893713638b381361cba610cbe446b79f75ca6dc1df481bf512cf1aaabbc93"} Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.987309 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b206732-dd22-4cb9-a322-ef8ea8021341" (UID: "4b206732-dd22-4cb9-a322-ef8ea8021341"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:04 crc kubenswrapper[4822]: I1010 06:44:04.987368 4822 scope.go:117] "RemoveContainer" containerID="383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.018974 4822 scope.go:117] "RemoveContainer" containerID="8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.019836 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data" (OuterVolumeSpecName: "config-data") pod "4b206732-dd22-4cb9-a322-ef8ea8021341" (UID: "4b206732-dd22-4cb9-a322-ef8ea8021341"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:05 crc kubenswrapper[4822]: E1010 06:44:05.020243 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b\": container with ID starting with 8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b not found: ID does not exist" containerID="8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.020277 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b"} err="failed to get container status \"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b\": rpc error: code = NotFound desc = could not find container \"8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b\": container with ID starting with 8383a20ffd0a3b0ad07daf26ecccfd10b2034a065cfdf435f47826fa6805072b not found: ID does not exist" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.020302 4822 scope.go:117] "RemoveContainer" 
containerID="383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085" Oct 10 06:44:05 crc kubenswrapper[4822]: E1010 06:44:05.021134 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085\": container with ID starting with 383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085 not found: ID does not exist" containerID="383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.021166 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085"} err="failed to get container status \"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085\": rpc error: code = NotFound desc = could not find container \"383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085\": container with ID starting with 383093e7bda36d5eac942aabbcbb5eebb834674765c83f5ebbcdf2279e41d085 not found: ID does not exist" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.038614 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.038995 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.039016 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b206732-dd22-4cb9-a322-ef8ea8021341-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.039027 4822 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b206732-dd22-4cb9-a322-ef8ea8021341-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.039039 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rfl5\" (UniqueName: \"kubernetes.io/projected/4b206732-dd22-4cb9-a322-ef8ea8021341-kube-api-access-5rfl5\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.288818 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.301314 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f84f988dd-n6ss7"] Oct 10 06:44:05 crc kubenswrapper[4822]: I1010 06:44:05.667939 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" path="/var/lib/kubelet/pods/4b206732-dd22-4cb9-a322-ef8ea8021341/volumes" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.676841 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:44:06 crc kubenswrapper[4822]: E1010 06:44:06.677527 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api-log" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.677540 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api-log" Oct 10 06:44:06 crc kubenswrapper[4822]: E1010 06:44:06.677567 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.677575 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" 
containerName="barbican-api" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.677749 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.677759 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b206732-dd22-4cb9-a322-ef8ea8021341" containerName="barbican-api-log" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.678696 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.684618 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.684875 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.685065 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.710185 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769483 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xffb6\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769555 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift\") pod 
\"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769597 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769630 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769667 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769684 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769732 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.769750 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871133 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871188 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871211 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871271 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871286 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871365 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xffb6\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871391 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871442 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871724 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.871894 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.876871 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.877462 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.879248 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.879778 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle\") pod \"swift-proxy-6b5485c95f-w8q56\" 
(UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.892499 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.898360 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xffb6\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6\") pod \"swift-proxy-6b5485c95f-w8q56\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:06 crc kubenswrapper[4822]: I1010 06:44:06.997539 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.615461 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.664496 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.890135 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.890521 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-log" containerID="cri-o://f0f97deab7199ec122d10381e74550fceb86eea97acef347ad6e88fcb9754e6a" gracePeriod=30 Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.890681 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-httpd" containerID="cri-o://b4aa99e351481e0bb7dbfcc43395a335eb68b03ff6aafce0ff40b4b34ffeb8ba" gracePeriod=30 Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.990397 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerStarted","Data":"065d149d1b211170467b39093535c955aae1f107db060308f9576b5afffb25a6"} Oct 10 06:44:07 crc kubenswrapper[4822]: I1010 06:44:07.990445 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerStarted","Data":"3eff87da3d7270cf39f79adfd026a933f6d7bb955dfb31a8855aafe4be4f1832"} Oct 10 06:44:08 crc kubenswrapper[4822]: I1010 06:44:08.370722 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:08 crc kubenswrapper[4822]: I1010 06:44:08.371422 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-central-agent" containerID="cri-o://43936f03c357f9c1d154f2aef74025c5597de4588e7e6b291fa4ba2ff5547c84" gracePeriod=30 Oct 10 06:44:08 crc kubenswrapper[4822]: I1010 06:44:08.371886 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="proxy-httpd" containerID="cri-o://08edf3d006337c1295bc66bb92a7f005b124ccaf1a0099173d6796735225fee7" gracePeriod=30 Oct 10 06:44:08 crc kubenswrapper[4822]: I1010 06:44:08.371952 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" 
containerName="sg-core" containerID="cri-o://f8baf0842acd07291e0bfa649f81cdf8a3eb732ce2a9b364c315a6052e23c0e2" gracePeriod=30 Oct 10 06:44:08 crc kubenswrapper[4822]: I1010 06:44:08.372002 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-notification-agent" containerID="cri-o://504c140e8ee512d9372cfd694641132689ce7cd1e99b2501bbde28cf540967f9" gracePeriod=30 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.011200 4822 generic.go:334] "Generic (PLEG): container finished" podID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerID="f0f97deab7199ec122d10381e74550fceb86eea97acef347ad6e88fcb9754e6a" exitCode=143 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.011528 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerDied","Data":"f0f97deab7199ec122d10381e74550fceb86eea97acef347ad6e88fcb9754e6a"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031763 4822 generic.go:334] "Generic (PLEG): container finished" podID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerID="08edf3d006337c1295bc66bb92a7f005b124ccaf1a0099173d6796735225fee7" exitCode=0 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031795 4822 generic.go:334] "Generic (PLEG): container finished" podID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerID="f8baf0842acd07291e0bfa649f81cdf8a3eb732ce2a9b364c315a6052e23c0e2" exitCode=2 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031817 4822 generic.go:334] "Generic (PLEG): container finished" podID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerID="504c140e8ee512d9372cfd694641132689ce7cd1e99b2501bbde28cf540967f9" exitCode=0 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031823 4822 generic.go:334] "Generic (PLEG): container finished" podID="88d23dde-d004-4a04-983e-bd72574a8c0d" 
containerID="43936f03c357f9c1d154f2aef74025c5597de4588e7e6b291fa4ba2ff5547c84" exitCode=0 Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031862 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerDied","Data":"08edf3d006337c1295bc66bb92a7f005b124ccaf1a0099173d6796735225fee7"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031888 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerDied","Data":"f8baf0842acd07291e0bfa649f81cdf8a3eb732ce2a9b364c315a6052e23c0e2"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031898 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerDied","Data":"504c140e8ee512d9372cfd694641132689ce7cd1e99b2501bbde28cf540967f9"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.031907 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerDied","Data":"43936f03c357f9c1d154f2aef74025c5597de4588e7e6b291fa4ba2ff5547c84"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.033384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerStarted","Data":"d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52"} Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.034554 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.034576 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 
06:44:09.069522 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6b5485c95f-w8q56" podStartSLOduration=3.069495613 podStartE2EDuration="3.069495613s" podCreationTimestamp="2025-10-10 06:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:44:09.054221156 +0000 UTC m=+1196.149379352" watchObservedRunningTime="2025-10-10 06:44:09.069495613 +0000 UTC m=+1196.164653809" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.096082 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.184082 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217522 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217566 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217658 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6mvk\" (UniqueName: \"kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: 
I1010 06:44:09.217682 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217819 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.217855 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts\") pod \"88d23dde-d004-4a04-983e-bd72574a8c0d\" (UID: \"88d23dde-d004-4a04-983e-bd72574a8c0d\") " Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.220002 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.220672 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.253482 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk" (OuterVolumeSpecName: "kube-api-access-k6mvk") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "kube-api-access-k6mvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.253632 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts" (OuterVolumeSpecName: "scripts") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.265245 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.320014 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.320055 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.320067 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.320080 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6mvk\" (UniqueName: \"kubernetes.io/projected/88d23dde-d004-4a04-983e-bd72574a8c0d-kube-api-access-k6mvk\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.320095 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88d23dde-d004-4a04-983e-bd72574a8c0d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.329519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.370996 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data" (OuterVolumeSpecName: "config-data") pod "88d23dde-d004-4a04-983e-bd72574a8c0d" (UID: "88d23dde-d004-4a04-983e-bd72574a8c0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.421456 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.421492 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d23dde-d004-4a04-983e-bd72574a8c0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:09 crc kubenswrapper[4822]: I1010 06:44:09.671740 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.055374 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88d23dde-d004-4a04-983e-bd72574a8c0d","Type":"ContainerDied","Data":"626a0ad15c171f247406374afa0f611c53b72fe529dfc7eded50bf4d0d5814f4"} Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.055635 4822 scope.go:117] "RemoveContainer" containerID="08edf3d006337c1295bc66bb92a7f005b124ccaf1a0099173d6796735225fee7" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.055397 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.081164 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.099308 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.112933 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:10 crc kubenswrapper[4822]: E1010 06:44:10.113358 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-notification-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113383 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-notification-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: E1010 06:44:10.113414 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-central-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113423 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-central-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: E1010 06:44:10.113437 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="proxy-httpd" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113444 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="proxy-httpd" Oct 10 06:44:10 crc kubenswrapper[4822]: E1010 06:44:10.113457 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="sg-core" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113462 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="sg-core" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113630 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-central-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113644 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="proxy-httpd" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113657 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="ceilometer-notification-agent" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.113673 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" containerName="sg-core" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.115255 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.117249 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.124404 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134126 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx8fm\" (UniqueName: \"kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134161 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134208 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134257 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134334 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134396 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.134422 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.143465 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.235905 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.235944 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx8fm\" (UniqueName: \"kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.235984 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.236023 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.236046 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.236081 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.236100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.237404 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 
06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.238018 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.242694 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.254553 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.255122 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.255897 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data\") pod \"ceilometer-0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.257695 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx8fm\" (UniqueName: \"kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm\") pod \"ceilometer-0\" (UID: 
\"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " pod="openstack/ceilometer-0" Oct 10 06:44:10 crc kubenswrapper[4822]: I1010 06:44:10.453555 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:11 crc kubenswrapper[4822]: I1010 06:44:11.667740 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d23dde-d004-4a04-983e-bd72574a8c0d" path="/var/lib/kubelet/pods/88d23dde-d004-4a04-983e-bd72574a8c0d/volumes" Oct 10 06:44:12 crc kubenswrapper[4822]: I1010 06:44:12.076273 4822 generic.go:334] "Generic (PLEG): container finished" podID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerID="b4aa99e351481e0bb7dbfcc43395a335eb68b03ff6aafce0ff40b4b34ffeb8ba" exitCode=0 Oct 10 06:44:12 crc kubenswrapper[4822]: I1010 06:44:12.076318 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerDied","Data":"b4aa99e351481e0bb7dbfcc43395a335eb68b03ff6aafce0ff40b4b34ffeb8ba"} Oct 10 06:44:13 crc kubenswrapper[4822]: I1010 06:44:13.679672 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:44:13 crc kubenswrapper[4822]: I1010 06:44:13.731716 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:44:13 crc kubenswrapper[4822]: I1010 06:44:13.731971 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d4866c7b-kfb67" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-api" containerID="cri-o://f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095" gracePeriod=30 Oct 10 06:44:13 crc kubenswrapper[4822]: I1010 06:44:13.732449 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d4866c7b-kfb67" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" 
containerName="neutron-httpd" containerID="cri-o://54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed" gracePeriod=30 Oct 10 06:44:14 crc kubenswrapper[4822]: I1010 06:44:14.106644 4822 generic.go:334] "Generic (PLEG): container finished" podID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerID="54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed" exitCode=0 Oct 10 06:44:14 crc kubenswrapper[4822]: I1010 06:44:14.106896 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerDied","Data":"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed"} Oct 10 06:44:14 crc kubenswrapper[4822]: I1010 06:44:14.865416 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.278413 4822 scope.go:117] "RemoveContainer" containerID="f8baf0842acd07291e0bfa649f81cdf8a3eb732ce2a9b364c315a6052e23c0e2" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.497426 4822 scope.go:117] "RemoveContainer" containerID="504c140e8ee512d9372cfd694641132689ce7cd1e99b2501bbde28cf540967f9" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.549378 4822 scope.go:117] "RemoveContainer" containerID="43936f03c357f9c1d154f2aef74025c5597de4588e7e6b291fa4ba2ff5547c84" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.623037 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754588 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754644 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754708 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754791 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754855 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754900 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxp9\" (UniqueName: 
\"kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.754954 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.755032 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data\") pod \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\" (UID: \"6b3ce8f2-0273-4f1c-b9ab-23b327461cce\") " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.755288 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs" (OuterVolumeSpecName: "logs") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.755686 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.759242 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts" (OuterVolumeSpecName: "scripts") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.759832 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.760852 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9" (OuterVolumeSpecName: "kube-api-access-6qxp9") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "kube-api-access-6qxp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.761573 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.783084 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.814967 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data" (OuterVolumeSpecName: "config-data") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.820936 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b3ce8f2-0273-4f1c-b9ab-23b327461cce" (UID: "6b3ce8f2-0273-4f1c-b9ab-23b327461cce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.844913 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857652 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857684 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857695 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857704 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857744 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857757 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxp9\" (UniqueName: \"kubernetes.io/projected/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-kube-api-access-6qxp9\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.857768 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3ce8f2-0273-4f1c-b9ab-23b327461cce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.876103 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 10 06:44:15 crc kubenswrapper[4822]: I1010 06:44:15.959531 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.124671 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8cd3778-5a5a-483a-af22-5b8420ae896b","Type":"ContainerStarted","Data":"37b4d43e434fbc8b371e9636d76d9af8856d96fcc8463cdd3a2a306557a6b929"} Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.127534 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerStarted","Data":"6402738bc9c80b40fd580dc154ed4bc0731f2904a9f87efb4a6e372bba5bdb45"} 
Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.130942 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b3ce8f2-0273-4f1c-b9ab-23b327461cce","Type":"ContainerDied","Data":"df54d7695e6d9e19dfe5b752cb8933d915e122251669c1e7350d7995c242c312"} Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.130979 4822 scope.go:117] "RemoveContainer" containerID="b4aa99e351481e0bb7dbfcc43395a335eb68b03ff6aafce0ff40b4b34ffeb8ba" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.131067 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.163049 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.484948455 podStartE2EDuration="13.163033346s" podCreationTimestamp="2025-10-10 06:44:03 +0000 UTC" firstStartedPulling="2025-10-10 06:44:04.670993661 +0000 UTC m=+1191.766151857" lastFinishedPulling="2025-10-10 06:44:15.349078552 +0000 UTC m=+1202.444236748" observedRunningTime="2025-10-10 06:44:16.155885377 +0000 UTC m=+1203.251043583" watchObservedRunningTime="2025-10-10 06:44:16.163033346 +0000 UTC m=+1203.258191542" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.168194 4822 scope.go:117] "RemoveContainer" containerID="f0f97deab7199ec122d10381e74550fceb86eea97acef347ad6e88fcb9754e6a" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.177898 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.187547 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.205136 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:16 crc 
kubenswrapper[4822]: E1010 06:44:16.205601 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-httpd" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.205624 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-httpd" Oct 10 06:44:16 crc kubenswrapper[4822]: E1010 06:44:16.205645 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-log" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.205653 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-log" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.205887 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-log" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.205916 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" containerName="glance-httpd" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.207067 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.210502 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.211768 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.226362 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.376827 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.376917 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.376966 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.376990 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.377021 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.377044 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.377086 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.377130 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4k2g\" (UniqueName: \"kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.478818 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.478869 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.478918 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.478967 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4k2g\" (UniqueName: \"kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.479003 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.479034 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.479064 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.479082 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.480152 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.480592 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.481159 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.499447 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.504022 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.504210 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.504540 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.506958 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4k2g\" (UniqueName: \"kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 
crc kubenswrapper[4822]: I1010 06:44:16.533404 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.607135 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.733525 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.733777 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-log" containerID="cri-o://573ef14ace4b4f889d3c0ab84ad4f3bfe8b82904867dcb11b92e9e04da75f069" gracePeriod=30 Oct 10 06:44:16 crc kubenswrapper[4822]: I1010 06:44:16.733945 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-httpd" containerID="cri-o://bbc088dc97d4b3f85b5d015621c645d96e6edf8aab87d72f1d14921dc62dc983" gracePeriod=30 Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.012114 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.036649 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.150694 4822 generic.go:334] "Generic (PLEG): container finished" podID="93b88728-7582-4106-8d27-cf2644ca1960" 
containerID="573ef14ace4b4f889d3c0ab84ad4f3bfe8b82904867dcb11b92e9e04da75f069" exitCode=143 Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.150789 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerDied","Data":"573ef14ace4b4f889d3c0ab84ad4f3bfe8b82904867dcb11b92e9e04da75f069"} Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.156056 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerStarted","Data":"3e7d276d1d3b3fb474bd5e9b35156a0cb9b7aeb19b8240310ff38e0f5a17fcf9"} Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.275589 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.667086 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3ce8f2-0273-4f1c-b9ab-23b327461cce" path="/var/lib/kubelet/pods/6b3ce8f2-0273-4f1c-b9ab-23b327461cce/volumes" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.725947 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.829869 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config\") pod \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.829926 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config\") pod \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.829952 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle\") pod \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.830078 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcnzj\" (UniqueName: \"kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj\") pod \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.830146 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs\") pod \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\" (UID: \"fd0c36cc-5309-4e57-a9fd-1aecd344b833\") " Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.838460 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fd0c36cc-5309-4e57-a9fd-1aecd344b833" (UID: "fd0c36cc-5309-4e57-a9fd-1aecd344b833"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.838871 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj" (OuterVolumeSpecName: "kube-api-access-zcnzj") pod "fd0c36cc-5309-4e57-a9fd-1aecd344b833" (UID: "fd0c36cc-5309-4e57-a9fd-1aecd344b833"). InnerVolumeSpecName "kube-api-access-zcnzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.892546 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config" (OuterVolumeSpecName: "config") pod "fd0c36cc-5309-4e57-a9fd-1aecd344b833" (UID: "fd0c36cc-5309-4e57-a9fd-1aecd344b833"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.900345 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd0c36cc-5309-4e57-a9fd-1aecd344b833" (UID: "fd0c36cc-5309-4e57-a9fd-1aecd344b833"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.931683 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.931714 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.931728 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.931739 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcnzj\" (UniqueName: \"kubernetes.io/projected/fd0c36cc-5309-4e57-a9fd-1aecd344b833-kube-api-access-zcnzj\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:17 crc kubenswrapper[4822]: I1010 06:44:17.953005 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fd0c36cc-5309-4e57-a9fd-1aecd344b833" (UID: "fd0c36cc-5309-4e57-a9fd-1aecd344b833"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.035233 4822 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0c36cc-5309-4e57-a9fd-1aecd344b833-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.185322 4822 generic.go:334] "Generic (PLEG): container finished" podID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerID="f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095" exitCode=0 Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.185510 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerDied","Data":"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.185626 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d4866c7b-kfb67" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.185745 4822 scope.go:117] "RemoveContainer" containerID="54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.185728 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d4866c7b-kfb67" event={"ID":"fd0c36cc-5309-4e57-a9fd-1aecd344b833","Type":"ContainerDied","Data":"fff7cf015b33f2a5880a93e456e248883822a80932a784f4d93f331c588a08ae"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.189281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerStarted","Data":"9928b4bb5c7c846b4d03b70516ed5ca072768b4d3aa864a47b1164b201b3d55f"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.189319 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerStarted","Data":"9e8a166db7e6684c2f70810fcd5fd1e2bda563d063aea0131ffcf24561a3576f"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.190591 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerStarted","Data":"ff627b4112d5afa4366e1239ee8cc64edf816c0effeb7dafc7e69ae5cc1194ec"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.190618 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerStarted","Data":"d554140da2cdf5519348ac91880f18fdafb0e4c11af74d8ffaba26f817dabc77"} Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.245313 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 
06:44:18.264136 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7d4866c7b-kfb67"] Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.309690 4822 scope.go:117] "RemoveContainer" containerID="f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.362146 4822 scope.go:117] "RemoveContainer" containerID="54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed" Oct 10 06:44:18 crc kubenswrapper[4822]: E1010 06:44:18.363969 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed\": container with ID starting with 54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed not found: ID does not exist" containerID="54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.364023 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed"} err="failed to get container status \"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed\": rpc error: code = NotFound desc = could not find container \"54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed\": container with ID starting with 54dcd0c071eb30c932a2eb872332e9539b5b14e55b6ed2a2f342d7116d8789ed not found: ID does not exist" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.364056 4822 scope.go:117] "RemoveContainer" containerID="f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095" Oct 10 06:44:18 crc kubenswrapper[4822]: E1010 06:44:18.365086 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095\": container with ID starting with 
f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095 not found: ID does not exist" containerID="f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095" Oct 10 06:44:18 crc kubenswrapper[4822]: I1010 06:44:18.365119 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095"} err="failed to get container status \"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095\": rpc error: code = NotFound desc = could not find container \"f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095\": container with ID starting with f84b7af63d3f4450685f6f690af5d371d401150b05bda1f4daa68d0bfd861095 not found: ID does not exist" Oct 10 06:44:19 crc kubenswrapper[4822]: I1010 06:44:19.201831 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerStarted","Data":"80843954eb6d7ec548593d47ef32a3ec88c616382899f1b6da51f373178452d9"} Oct 10 06:44:19 crc kubenswrapper[4822]: I1010 06:44:19.229767 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.229745209 podStartE2EDuration="3.229745209s" podCreationTimestamp="2025-10-10 06:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:44:19.223264379 +0000 UTC m=+1206.318422585" watchObservedRunningTime="2025-10-10 06:44:19.229745209 +0000 UTC m=+1206.324903415" Oct 10 06:44:19 crc kubenswrapper[4822]: I1010 06:44:19.659563 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" path="/var/lib/kubelet/pods/fd0c36cc-5309-4e57-a9fd-1aecd344b833/volumes" Oct 10 06:44:20 crc kubenswrapper[4822]: I1010 06:44:20.215630 4822 generic.go:334] "Generic 
(PLEG): container finished" podID="93b88728-7582-4106-8d27-cf2644ca1960" containerID="bbc088dc97d4b3f85b5d015621c645d96e6edf8aab87d72f1d14921dc62dc983" exitCode=0 Oct 10 06:44:20 crc kubenswrapper[4822]: I1010 06:44:20.215979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerDied","Data":"bbc088dc97d4b3f85b5d015621c645d96e6edf8aab87d72f1d14921dc62dc983"} Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.230513 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerStarted","Data":"df00ac7cd3316056516aa5090f987440f7eb094e9585dd62542db77e5346e015"} Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.231260 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-central-agent" containerID="cri-o://3e7d276d1d3b3fb474bd5e9b35156a0cb9b7aeb19b8240310ff38e0f5a17fcf9" gracePeriod=30 Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.231543 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.231932 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="proxy-httpd" containerID="cri-o://df00ac7cd3316056516aa5090f987440f7eb094e9585dd62542db77e5346e015" gracePeriod=30 Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.232017 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="sg-core" containerID="cri-o://9928b4bb5c7c846b4d03b70516ed5ca072768b4d3aa864a47b1164b201b3d55f" gracePeriod=30 Oct 10 06:44:21 crc 
kubenswrapper[4822]: I1010 06:44:21.232111 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-notification-agent" containerID="cri-o://9e8a166db7e6684c2f70810fcd5fd1e2bda563d063aea0131ffcf24561a3576f" gracePeriod=30 Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.268821 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.185965088 podStartE2EDuration="11.268785594s" podCreationTimestamp="2025-10-10 06:44:10 +0000 UTC" firstStartedPulling="2025-10-10 06:44:15.845519921 +0000 UTC m=+1202.940678117" lastFinishedPulling="2025-10-10 06:44:20.928340427 +0000 UTC m=+1208.023498623" observedRunningTime="2025-10-10 06:44:21.257612857 +0000 UTC m=+1208.352771083" watchObservedRunningTime="2025-10-10 06:44:21.268785594 +0000 UTC m=+1208.363943800" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.392960 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424193 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424279 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424379 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424422 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47zrn\" (UniqueName: \"kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424475 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424507 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424570 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424645 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run\") pod \"93b88728-7582-4106-8d27-cf2644ca1960\" (UID: \"93b88728-7582-4106-8d27-cf2644ca1960\") " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.424918 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs" (OuterVolumeSpecName: "logs") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.425227 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.425447 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.425472 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93b88728-7582-4106-8d27-cf2644ca1960-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.434387 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts" (OuterVolumeSpecName: "scripts") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.440888 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.450998 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn" (OuterVolumeSpecName: "kube-api-access-47zrn") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "kube-api-access-47zrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.456976 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.505996 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data" (OuterVolumeSpecName: "config-data") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.513749 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93b88728-7582-4106-8d27-cf2644ca1960" (UID: "93b88728-7582-4106-8d27-cf2644ca1960"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529026 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529267 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529277 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529288 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47zrn\" (UniqueName: \"kubernetes.io/projected/93b88728-7582-4106-8d27-cf2644ca1960-kube-api-access-47zrn\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529297 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b88728-7582-4106-8d27-cf2644ca1960-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.529328 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.547281 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 10 06:44:21 crc kubenswrapper[4822]: I1010 06:44:21.630841 4822 reconciler_common.go:293] "Volume detached for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.241880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"93b88728-7582-4106-8d27-cf2644ca1960","Type":"ContainerDied","Data":"94305d8fe92a8e1899f3745ac1a95b787ab2309fd904d0ea7066c788bba80e0b"} Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.241943 4822 scope.go:117] "RemoveContainer" containerID="bbc088dc97d4b3f85b5d015621c645d96e6edf8aab87d72f1d14921dc62dc983" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.242088 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249293 4822 generic.go:334] "Generic (PLEG): container finished" podID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerID="9928b4bb5c7c846b4d03b70516ed5ca072768b4d3aa864a47b1164b201b3d55f" exitCode=2 Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249332 4822 generic.go:334] "Generic (PLEG): container finished" podID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerID="9e8a166db7e6684c2f70810fcd5fd1e2bda563d063aea0131ffcf24561a3576f" exitCode=0 Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249343 4822 generic.go:334] "Generic (PLEG): container finished" podID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerID="3e7d276d1d3b3fb474bd5e9b35156a0cb9b7aeb19b8240310ff38e0f5a17fcf9" exitCode=0 Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerDied","Data":"9928b4bb5c7c846b4d03b70516ed5ca072768b4d3aa864a47b1164b201b3d55f"} Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249394 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerDied","Data":"9e8a166db7e6684c2f70810fcd5fd1e2bda563d063aea0131ffcf24561a3576f"} Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.249408 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerDied","Data":"3e7d276d1d3b3fb474bd5e9b35156a0cb9b7aeb19b8240310ff38e0f5a17fcf9"} Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.268613 4822 scope.go:117] "RemoveContainer" containerID="573ef14ace4b4f889d3c0ab84ad4f3bfe8b82904867dcb11b92e9e04da75f069" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.278389 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.303620 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313190 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:44:22 crc kubenswrapper[4822]: E1010 06:44:22.313625 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-log" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313649 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-log" Oct 10 06:44:22 crc kubenswrapper[4822]: E1010 06:44:22.313678 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313686 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: E1010 06:44:22.313716 4822 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-api" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313724 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-api" Oct 10 06:44:22 crc kubenswrapper[4822]: E1010 06:44:22.313751 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313759 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313959 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.313979 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-api" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.314000 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0c36cc-5309-4e57-a9fd-1aecd344b833" containerName="neutron-httpd" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.314021 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b88728-7582-4106-8d27-cf2644ca1960" containerName="glance-log" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.316836 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.318993 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.320883 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.322175 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.344967 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345013 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4r9\" (UniqueName: \"kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345044 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345082 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345101 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345183 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345205 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.345230 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0" Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.446211 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.446638 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.446251 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.446958 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447044 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4r9\" (UniqueName: \"kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447095 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447126 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447189 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447449 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.447540 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.451575 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.451776 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.451791 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.453397 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.464023 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4r9\" (UniqueName: \"kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.476027 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") " pod="openstack/glance-default-external-api-0"
Oct 10 06:44:22 crc kubenswrapper[4822]: I1010 06:44:22.644589 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:23 crc kubenswrapper[4822]: I1010 06:44:23.192003 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 10 06:44:23 crc kubenswrapper[4822]: W1010 06:44:23.240897 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ce9853_109f_456d_b51c_b1d11072a90d.slice/crio-bd27b46ad44fbb693d221e97cb884bb4bdeb820c1a9927b20bb8f06d919ad15d WatchSource:0}: Error finding container bd27b46ad44fbb693d221e97cb884bb4bdeb820c1a9927b20bb8f06d919ad15d: Status 404 returned error can't find the container with id bd27b46ad44fbb693d221e97cb884bb4bdeb820c1a9927b20bb8f06d919ad15d
Oct 10 06:44:23 crc kubenswrapper[4822]: I1010 06:44:23.267704 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerStarted","Data":"bd27b46ad44fbb693d221e97cb884bb4bdeb820c1a9927b20bb8f06d919ad15d"}
Oct 10 06:44:23 crc kubenswrapper[4822]: I1010 06:44:23.678511 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b88728-7582-4106-8d27-cf2644ca1960" path="/var/lib/kubelet/pods/93b88728-7582-4106-8d27-cf2644ca1960/volumes"
Oct 10 06:44:24 crc kubenswrapper[4822]: I1010 06:44:24.281559 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerStarted","Data":"9baba77c235fa2da53a1933646bcf77963daf0d1bceccb6370d40760af07bd34"}
Oct 10 06:44:25 crc kubenswrapper[4822]: I1010 06:44:25.294727 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerStarted","Data":"0f9a24b9406d92324b02f103bd4f78da68951fba025fb39c169f20c3b804f0f8"}
Oct 10 06:44:25 crc kubenswrapper[4822]: I1010 06:44:25.320178 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.320150623 podStartE2EDuration="3.320150623s" podCreationTimestamp="2025-10-10 06:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:44:25.316676162 +0000 UTC m=+1212.411834348" watchObservedRunningTime="2025-10-10 06:44:25.320150623 +0000 UTC m=+1212.415308839"
Oct 10 06:44:26 crc kubenswrapper[4822]: I1010 06:44:26.608395 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:26 crc kubenswrapper[4822]: I1010 06:44:26.608749 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:26 crc kubenswrapper[4822]: I1010 06:44:26.638957 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:26 crc kubenswrapper[4822]: I1010 06:44:26.659612 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:27 crc kubenswrapper[4822]: I1010 06:44:27.310498 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:27 crc kubenswrapper[4822]: I1010 06:44:27.310864 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:29 crc kubenswrapper[4822]: I1010 06:44:29.503289 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:29 crc kubenswrapper[4822]: I1010 06:44:29.503784 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 10 06:44:29 crc kubenswrapper[4822]: I1010 06:44:29.504274 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.560144 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rxtp5"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.562509 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.581053 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rxtp5"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.660273 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qn8pd"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.661315 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.665692 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qn8pd"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.729000 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp5jn\" (UniqueName: \"kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn\") pod \"nova-api-db-create-rxtp5\" (UID: \"175b9473-ce90-4e92-8322-f64cccbcb54b\") " pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.764135 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-v88zt"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.765639 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.775132 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v88zt"]
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.830779 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp5jn\" (UniqueName: \"kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn\") pod \"nova-api-db-create-rxtp5\" (UID: \"175b9473-ce90-4e92-8322-f64cccbcb54b\") " pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.830957 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56h64\" (UniqueName: \"kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64\") pod \"nova-cell0-db-create-qn8pd\" (UID: \"6eb1f8a4-2601-4901-a55c-95fb18a9613c\") " pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.865161 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp5jn\" (UniqueName: \"kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn\") pod \"nova-api-db-create-rxtp5\" (UID: \"175b9473-ce90-4e92-8322-f64cccbcb54b\") " pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.888836 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.933300 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h64\" (UniqueName: \"kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64\") pod \"nova-cell0-db-create-qn8pd\" (UID: \"6eb1f8a4-2601-4901-a55c-95fb18a9613c\") " pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.938077 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64llh\" (UniqueName: \"kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh\") pod \"nova-cell1-db-create-v88zt\" (UID: \"b8447229-0ca7-47ad-9404-c84da379670f\") " pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.959234 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56h64\" (UniqueName: \"kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64\") pod \"nova-cell0-db-create-qn8pd\" (UID: \"6eb1f8a4-2601-4901-a55c-95fb18a9613c\") " pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:31 crc kubenswrapper[4822]: I1010 06:44:31.982430 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.051029 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64llh\" (UniqueName: \"kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh\") pod \"nova-cell1-db-create-v88zt\" (UID: \"b8447229-0ca7-47ad-9404-c84da379670f\") " pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.082058 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64llh\" (UniqueName: \"kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh\") pod \"nova-cell1-db-create-v88zt\" (UID: \"b8447229-0ca7-47ad-9404-c84da379670f\") " pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.085296 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.604857 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rxtp5"]
Oct 10 06:44:32 crc kubenswrapper[4822]: W1010 06:44:32.611287 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod175b9473_ce90_4e92_8322_f64cccbcb54b.slice/crio-a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417 WatchSource:0}: Error finding container a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417: Status 404 returned error can't find the container with id a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.645897 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.645951 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.693738 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qn8pd"]
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.697496 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.700735 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v88zt"]
Oct 10 06:44:32 crc kubenswrapper[4822]: W1010 06:44:32.705862 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8447229_0ca7_47ad_9404_c84da379670f.slice/crio-c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc WatchSource:0}: Error finding container c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc: Status 404 returned error can't find the container with id c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc
Oct 10 06:44:32 crc kubenswrapper[4822]: W1010 06:44:32.711028 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eb1f8a4_2601_4901_a55c_95fb18a9613c.slice/crio-955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6 WatchSource:0}: Error finding container 955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6: Status 404 returned error can't find the container with id 955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6
Oct 10 06:44:32 crc kubenswrapper[4822]: I1010 06:44:32.743749 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.368691 4822 generic.go:334] "Generic (PLEG): container finished" podID="b8447229-0ca7-47ad-9404-c84da379670f" containerID="d316ea091d0d3fd74e111a4ac49ba83ff68275a343bdfa35aaa4a262cd3d37ff" exitCode=0
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.368758 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v88zt" event={"ID":"b8447229-0ca7-47ad-9404-c84da379670f","Type":"ContainerDied","Data":"d316ea091d0d3fd74e111a4ac49ba83ff68275a343bdfa35aaa4a262cd3d37ff"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.368783 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v88zt" event={"ID":"b8447229-0ca7-47ad-9404-c84da379670f","Type":"ContainerStarted","Data":"c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.370411 4822 generic.go:334] "Generic (PLEG): container finished" podID="6eb1f8a4-2601-4901-a55c-95fb18a9613c" containerID="53d9ccc3b552464384024fafe8b9e9476990f0e498f5251b3ebd572bda3dd7c2" exitCode=0
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.370518 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qn8pd" event={"ID":"6eb1f8a4-2601-4901-a55c-95fb18a9613c","Type":"ContainerDied","Data":"53d9ccc3b552464384024fafe8b9e9476990f0e498f5251b3ebd572bda3dd7c2"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.370598 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qn8pd" event={"ID":"6eb1f8a4-2601-4901-a55c-95fb18a9613c","Type":"ContainerStarted","Data":"955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.372282 4822 generic.go:334] "Generic (PLEG): container finished" podID="175b9473-ce90-4e92-8322-f64cccbcb54b" containerID="2236649086e06115ee6f2c2d259dcd51713877ec4cfa78014abba7e69c828377" exitCode=0
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.372947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxtp5" event={"ID":"175b9473-ce90-4e92-8322-f64cccbcb54b","Type":"ContainerDied","Data":"2236649086e06115ee6f2c2d259dcd51713877ec4cfa78014abba7e69c828377"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.372985 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxtp5" event={"ID":"175b9473-ce90-4e92-8322-f64cccbcb54b","Type":"ContainerStarted","Data":"a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417"}
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.373006 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:33 crc kubenswrapper[4822]: I1010 06:44:33.373163 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:34 crc kubenswrapper[4822]: I1010 06:44:34.861114 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:34 crc kubenswrapper[4822]: I1010 06:44:34.869218 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:34 crc kubenswrapper[4822]: I1010 06:44:34.893762 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.023634 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp5jn\" (UniqueName: \"kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn\") pod \"175b9473-ce90-4e92-8322-f64cccbcb54b\" (UID: \"175b9473-ce90-4e92-8322-f64cccbcb54b\") "
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.023686 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64llh\" (UniqueName: \"kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh\") pod \"b8447229-0ca7-47ad-9404-c84da379670f\" (UID: \"b8447229-0ca7-47ad-9404-c84da379670f\") "
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.023841 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56h64\" (UniqueName: \"kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64\") pod \"6eb1f8a4-2601-4901-a55c-95fb18a9613c\" (UID: \"6eb1f8a4-2601-4901-a55c-95fb18a9613c\") "
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.030051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh" (OuterVolumeSpecName: "kube-api-access-64llh") pod "b8447229-0ca7-47ad-9404-c84da379670f" (UID: "b8447229-0ca7-47ad-9404-c84da379670f"). InnerVolumeSpecName "kube-api-access-64llh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.031317 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn" (OuterVolumeSpecName: "kube-api-access-mp5jn") pod "175b9473-ce90-4e92-8322-f64cccbcb54b" (UID: "175b9473-ce90-4e92-8322-f64cccbcb54b"). InnerVolumeSpecName "kube-api-access-mp5jn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.036134 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64" (OuterVolumeSpecName: "kube-api-access-56h64") pod "6eb1f8a4-2601-4901-a55c-95fb18a9613c" (UID: "6eb1f8a4-2601-4901-a55c-95fb18a9613c"). InnerVolumeSpecName "kube-api-access-56h64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.130732 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56h64\" (UniqueName: \"kubernetes.io/projected/6eb1f8a4-2601-4901-a55c-95fb18a9613c-kube-api-access-56h64\") on node \"crc\" DevicePath \"\""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.130774 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp5jn\" (UniqueName: \"kubernetes.io/projected/175b9473-ce90-4e92-8322-f64cccbcb54b-kube-api-access-mp5jn\") on node \"crc\" DevicePath \"\""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.130783 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64llh\" (UniqueName: \"kubernetes.io/projected/b8447229-0ca7-47ad-9404-c84da379670f-kube-api-access-64llh\") on node \"crc\" DevicePath \"\""
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.390050 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v88zt"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.390049 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v88zt" event={"ID":"b8447229-0ca7-47ad-9404-c84da379670f","Type":"ContainerDied","Data":"c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc"}
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.390105 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f150600fa72b038424c573027b014c4a06b03a07f4e379bf8d797c5469b4fc"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.391770 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qn8pd"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.392016 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qn8pd" event={"ID":"6eb1f8a4-2601-4901-a55c-95fb18a9613c","Type":"ContainerDied","Data":"955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6"}
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.392046 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955ad746ee36a22dfe12d309b3b5f3fe87779acba6f46fb8317b0e7ad35d98f6"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.396665 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxtp5" event={"ID":"175b9473-ce90-4e92-8322-f64cccbcb54b","Type":"ContainerDied","Data":"a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417"}
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.396703 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxtp5"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.396714 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a553fa778e9c454b6113cebd8841bbf93deea5304f841d17b3e1574117448417"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.415214 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.415308 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 10 06:44:35 crc kubenswrapper[4822]: I1010 06:44:35.506023 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 10 06:44:40 crc kubenswrapper[4822]: I1010 06:44:40.459320 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.808734 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bdb6-account-create-9kt24"]
Oct 10 06:44:41 crc kubenswrapper[4822]: E1010 06:44:41.812095 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8447229-0ca7-47ad-9404-c84da379670f" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812110 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8447229-0ca7-47ad-9404-c84da379670f" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: E1010 06:44:41.812197 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb1f8a4-2601-4901-a55c-95fb18a9613c" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812239 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb1f8a4-2601-4901-a55c-95fb18a9613c" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: E1010 06:44:41.812309 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175b9473-ce90-4e92-8322-f64cccbcb54b" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812315 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="175b9473-ce90-4e92-8322-f64cccbcb54b" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812866 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8447229-0ca7-47ad-9404-c84da379670f" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812899 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="175b9473-ce90-4e92-8322-f64cccbcb54b" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.812919 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb1f8a4-2601-4901-a55c-95fb18a9613c" containerName="mariadb-database-create"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.813747 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bdb6-account-create-9kt24"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.821447 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bdb6-account-create-9kt24"]
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.840822 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.863745 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gnz\" (UniqueName: \"kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz\") pod \"nova-api-bdb6-account-create-9kt24\" (UID: \"8e6c12c1-b88d-4488-9660-499070fbea2c\") " pod="openstack/nova-api-bdb6-account-create-9kt24"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.966012 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gnz\" (UniqueName: \"kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz\") pod \"nova-api-bdb6-account-create-9kt24\" (UID: \"8e6c12c1-b88d-4488-9660-499070fbea2c\") " pod="openstack/nova-api-bdb6-account-create-9kt24"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.988581 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7142-account-create-bmfp7"]
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.989797 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7142-account-create-bmfp7"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.998459 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.998863 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gnz\" (UniqueName: \"kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz\") pod \"nova-api-bdb6-account-create-9kt24\" (UID: \"8e6c12c1-b88d-4488-9660-499070fbea2c\") " pod="openstack/nova-api-bdb6-account-create-9kt24"
Oct 10 06:44:41 crc kubenswrapper[4822]: I1010 06:44:41.999686 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7142-account-create-bmfp7"]
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.068447 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hq9\" (UniqueName: \"kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9\") pod \"nova-cell0-7142-account-create-bmfp7\" (UID: \"36b650a5-7a88-47df-a02f-87dc3eee6f89\") " pod="openstack/nova-cell0-7142-account-create-bmfp7"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.163315 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bdb6-account-create-9kt24"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.170301 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hq9\" (UniqueName: \"kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9\") pod \"nova-cell0-7142-account-create-bmfp7\" (UID: \"36b650a5-7a88-47df-a02f-87dc3eee6f89\") " pod="openstack/nova-cell0-7142-account-create-bmfp7"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.184209 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1f81-account-create-66lbl"]
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.185958 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f81-account-create-66lbl"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.190470 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.196677 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hq9\" (UniqueName: \"kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9\") pod \"nova-cell0-7142-account-create-bmfp7\" (UID: \"36b650a5-7a88-47df-a02f-87dc3eee6f89\") " pod="openstack/nova-cell0-7142-account-create-bmfp7"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.204512 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f81-account-create-66lbl"]
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.275858 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rp2t\" (UniqueName: \"kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t\") pod \"nova-cell1-1f81-account-create-66lbl\" (UID: \"3b05baa2-bc07-4801-a944-30ee51acd6c5\") " pod="openstack/nova-cell1-1f81-account-create-66lbl"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.353365 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7142-account-create-bmfp7"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.377771 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rp2t\" (UniqueName: \"kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t\") pod \"nova-cell1-1f81-account-create-66lbl\" (UID: \"3b05baa2-bc07-4801-a944-30ee51acd6c5\") " pod="openstack/nova-cell1-1f81-account-create-66lbl"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.401546 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rp2t\" (UniqueName: \"kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t\") pod \"nova-cell1-1f81-account-create-66lbl\" (UID: \"3b05baa2-bc07-4801-a944-30ee51acd6c5\") " pod="openstack/nova-cell1-1f81-account-create-66lbl"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.591149 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f81-account-create-66lbl"
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.598514 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bdb6-account-create-9kt24"]
Oct 10 06:44:42 crc kubenswrapper[4822]: W1010 06:44:42.613824 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6c12c1_b88d_4488_9660_499070fbea2c.slice/crio-1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20 WatchSource:0}: Error finding container 1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20: Status 404 returned error can't find the container with id 1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20
Oct 10 06:44:42 crc kubenswrapper[4822]: I1010 06:44:42.764176 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7142-account-create-bmfp7"]
Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.014789 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f81-account-create-66lbl"]
Oct 10 06:44:43 crc kubenswrapper[4822]: W1010 06:44:43.063753 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b05baa2_bc07_4801_a944_30ee51acd6c5.slice/crio-33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d WatchSource:0}: Error finding container 33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d: Status 404 returned error can't find the container with id 33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d
Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.472562 4822 generic.go:334] "Generic (PLEG): container finished" podID="8e6c12c1-b88d-4488-9660-499070fbea2c" containerID="e4e5f571245440132f9f5f730db406d6b9a307ca592fcc9f54d08211bd30cd68" exitCode=0
Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.472869 4822
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bdb6-account-create-9kt24" event={"ID":"8e6c12c1-b88d-4488-9660-499070fbea2c","Type":"ContainerDied","Data":"e4e5f571245440132f9f5f730db406d6b9a307ca592fcc9f54d08211bd30cd68"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.472909 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bdb6-account-create-9kt24" event={"ID":"8e6c12c1-b88d-4488-9660-499070fbea2c","Type":"ContainerStarted","Data":"1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.474526 4822 generic.go:334] "Generic (PLEG): container finished" podID="36b650a5-7a88-47df-a02f-87dc3eee6f89" containerID="164f4d17f67c30c9f98fb7fe3fea3e4c324c7f1a7a425e0992fdc9a6d2c119e1" exitCode=0 Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.474589 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7142-account-create-bmfp7" event={"ID":"36b650a5-7a88-47df-a02f-87dc3eee6f89","Type":"ContainerDied","Data":"164f4d17f67c30c9f98fb7fe3fea3e4c324c7f1a7a425e0992fdc9a6d2c119e1"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.474617 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7142-account-create-bmfp7" event={"ID":"36b650a5-7a88-47df-a02f-87dc3eee6f89","Type":"ContainerStarted","Data":"81a82af95aa3408b07afdad3185130390b0aba3a696a2c3a98253a2d6969ea00"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.477107 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f81-account-create-66lbl" event={"ID":"3b05baa2-bc07-4801-a944-30ee51acd6c5","Type":"ContainerStarted","Data":"83b64c5e02ef1f3e662badc32279e52bb5de1266e576765efe6ed3ac8baae657"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.477139 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f81-account-create-66lbl" 
event={"ID":"3b05baa2-bc07-4801-a944-30ee51acd6c5","Type":"ContainerStarted","Data":"33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d"} Oct 10 06:44:43 crc kubenswrapper[4822]: I1010 06:44:43.523652 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1f81-account-create-66lbl" podStartSLOduration=1.523635093 podStartE2EDuration="1.523635093s" podCreationTimestamp="2025-10-10 06:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:44:43.516237626 +0000 UTC m=+1230.611395832" watchObservedRunningTime="2025-10-10 06:44:43.523635093 +0000 UTC m=+1230.618793289" Oct 10 06:44:44 crc kubenswrapper[4822]: I1010 06:44:44.488627 4822 generic.go:334] "Generic (PLEG): container finished" podID="3b05baa2-bc07-4801-a944-30ee51acd6c5" containerID="83b64c5e02ef1f3e662badc32279e52bb5de1266e576765efe6ed3ac8baae657" exitCode=0 Oct 10 06:44:44 crc kubenswrapper[4822]: I1010 06:44:44.488757 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f81-account-create-66lbl" event={"ID":"3b05baa2-bc07-4801-a944-30ee51acd6c5","Type":"ContainerDied","Data":"83b64c5e02ef1f3e662badc32279e52bb5de1266e576765efe6ed3ac8baae657"} Oct 10 06:44:44 crc kubenswrapper[4822]: I1010 06:44:44.925520 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bdb6-account-create-9kt24" Oct 10 06:44:44 crc kubenswrapper[4822]: I1010 06:44:44.944648 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7142-account-create-bmfp7" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.038384 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2gnz\" (UniqueName: \"kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz\") pod \"8e6c12c1-b88d-4488-9660-499070fbea2c\" (UID: \"8e6c12c1-b88d-4488-9660-499070fbea2c\") " Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.038593 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hq9\" (UniqueName: \"kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9\") pod \"36b650a5-7a88-47df-a02f-87dc3eee6f89\" (UID: \"36b650a5-7a88-47df-a02f-87dc3eee6f89\") " Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.044063 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9" (OuterVolumeSpecName: "kube-api-access-l5hq9") pod "36b650a5-7a88-47df-a02f-87dc3eee6f89" (UID: "36b650a5-7a88-47df-a02f-87dc3eee6f89"). InnerVolumeSpecName "kube-api-access-l5hq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.044369 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz" (OuterVolumeSpecName: "kube-api-access-g2gnz") pod "8e6c12c1-b88d-4488-9660-499070fbea2c" (UID: "8e6c12c1-b88d-4488-9660-499070fbea2c"). InnerVolumeSpecName "kube-api-access-g2gnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.141053 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2gnz\" (UniqueName: \"kubernetes.io/projected/8e6c12c1-b88d-4488-9660-499070fbea2c-kube-api-access-g2gnz\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.142039 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5hq9\" (UniqueName: \"kubernetes.io/projected/36b650a5-7a88-47df-a02f-87dc3eee6f89-kube-api-access-l5hq9\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.498131 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bdb6-account-create-9kt24" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.498118 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bdb6-account-create-9kt24" event={"ID":"8e6c12c1-b88d-4488-9660-499070fbea2c","Type":"ContainerDied","Data":"1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20"} Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.498555 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1de777b88c5217ace5e7a38d390580fb813f096ff63e5ba3602fe257ca6dbc20" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.499569 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7142-account-create-bmfp7" event={"ID":"36b650a5-7a88-47df-a02f-87dc3eee6f89","Type":"ContainerDied","Data":"81a82af95aa3408b07afdad3185130390b0aba3a696a2c3a98253a2d6969ea00"} Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.499612 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81a82af95aa3408b07afdad3185130390b0aba3a696a2c3a98253a2d6969ea00" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.499641 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell0-7142-account-create-bmfp7" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.760055 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f81-account-create-66lbl" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.859742 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rp2t\" (UniqueName: \"kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t\") pod \"3b05baa2-bc07-4801-a944-30ee51acd6c5\" (UID: \"3b05baa2-bc07-4801-a944-30ee51acd6c5\") " Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.867481 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t" (OuterVolumeSpecName: "kube-api-access-7rp2t") pod "3b05baa2-bc07-4801-a944-30ee51acd6c5" (UID: "3b05baa2-bc07-4801-a944-30ee51acd6c5"). InnerVolumeSpecName "kube-api-access-7rp2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:45 crc kubenswrapper[4822]: I1010 06:44:45.963703 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rp2t\" (UniqueName: \"kubernetes.io/projected/3b05baa2-bc07-4801-a944-30ee51acd6c5-kube-api-access-7rp2t\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:46 crc kubenswrapper[4822]: I1010 06:44:46.510414 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f81-account-create-66lbl" event={"ID":"3b05baa2-bc07-4801-a944-30ee51acd6c5","Type":"ContainerDied","Data":"33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d"} Oct 10 06:44:46 crc kubenswrapper[4822]: I1010 06:44:46.510676 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a2fd32b3ce547f05604332a539a4030cb4753145d490ed71506c2aaa12f68d" Oct 10 06:44:46 crc kubenswrapper[4822]: I1010 06:44:46.510519 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f81-account-create-66lbl" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.301624 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jk5j7"] Oct 10 06:44:47 crc kubenswrapper[4822]: E1010 06:44:47.302100 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6c12c1-b88d-4488-9660-499070fbea2c" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302124 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6c12c1-b88d-4488-9660-499070fbea2c" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: E1010 06:44:47.302148 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b05baa2-bc07-4801-a944-30ee51acd6c5" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302158 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3b05baa2-bc07-4801-a944-30ee51acd6c5" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: E1010 06:44:47.302175 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b650a5-7a88-47df-a02f-87dc3eee6f89" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302187 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b650a5-7a88-47df-a02f-87dc3eee6f89" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302411 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b05baa2-bc07-4801-a944-30ee51acd6c5" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302436 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6c12c1-b88d-4488-9660-499070fbea2c" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.302455 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b650a5-7a88-47df-a02f-87dc3eee6f89" containerName="mariadb-account-create" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.303178 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.310015 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.310075 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.310145 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pxldg" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.326313 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jk5j7"] Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.391919 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.391998 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.392138 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " 
pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.392164 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn58t\" (UniqueName: \"kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.493661 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.494074 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.495017 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.495416 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn58t\" (UniqueName: \"kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: 
\"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.499824 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.507473 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.508116 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.527205 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn58t\" (UniqueName: \"kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t\") pod \"nova-cell0-conductor-db-sync-jk5j7\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:47 crc kubenswrapper[4822]: I1010 06:44:47.626132 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:44:48 crc kubenswrapper[4822]: W1010 06:44:48.142150 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37631965_5ba5_48e5_93a4_b5caa45ac6e5.slice/crio-d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21 WatchSource:0}: Error finding container d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21: Status 404 returned error can't find the container with id d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21 Oct 10 06:44:48 crc kubenswrapper[4822]: I1010 06:44:48.143523 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jk5j7"] Oct 10 06:44:48 crc kubenswrapper[4822]: I1010 06:44:48.537873 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" event={"ID":"37631965-5ba5-48e5-93a4-b5caa45ac6e5","Type":"ContainerStarted","Data":"d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21"} Oct 10 06:44:51 crc kubenswrapper[4822]: I1010 06:44:51.571921 4822 generic.go:334] "Generic (PLEG): container finished" podID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerID="df00ac7cd3316056516aa5090f987440f7eb094e9585dd62542db77e5346e015" exitCode=137 Oct 10 06:44:51 crc kubenswrapper[4822]: I1010 06:44:51.572334 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerDied","Data":"df00ac7cd3316056516aa5090f987440f7eb094e9585dd62542db77e5346e015"} Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.599576 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6b36ea1-cdc2-4db2-b425-8437aed45ec0","Type":"ContainerDied","Data":"6402738bc9c80b40fd580dc154ed4bc0731f2904a9f87efb4a6e372bba5bdb45"} Oct 10 06:44:54 crc kubenswrapper[4822]: 
I1010 06:44:54.600091 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6402738bc9c80b40fd580dc154ed4bc0731f2904a9f87efb4a6e372bba5bdb45" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.607006 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.729757 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.729914 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx8fm\" (UniqueName: \"kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730024 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730082 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730123 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730198 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730254 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd\") pod \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\" (UID: \"c6b36ea1-cdc2-4db2-b425-8437aed45ec0\") " Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.730622 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.731081 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.731399 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.731417 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.736367 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts" (OuterVolumeSpecName: "scripts") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.736535 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm" (OuterVolumeSpecName: "kube-api-access-nx8fm") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "kube-api-access-nx8fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.759383 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.833731 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.833780 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx8fm\" (UniqueName: \"kubernetes.io/projected/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-kube-api-access-nx8fm\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.833820 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.855687 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data" (OuterVolumeSpecName: "config-data") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.856486 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6b36ea1-cdc2-4db2-b425-8437aed45ec0" (UID: "c6b36ea1-cdc2-4db2-b425-8437aed45ec0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.935134 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:54 crc kubenswrapper[4822]: I1010 06:44:54.935170 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6b36ea1-cdc2-4db2-b425-8437aed45ec0-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.609940 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" event={"ID":"37631965-5ba5-48e5-93a4-b5caa45ac6e5","Type":"ContainerStarted","Data":"c6d5c971df658fefb60ce2392b3efcf4416ced783ebf37b679584284d13e2d5f"} Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.609984 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.635443 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" podStartSLOduration=2.177244413 podStartE2EDuration="8.635427953s" podCreationTimestamp="2025-10-10 06:44:47 +0000 UTC" firstStartedPulling="2025-10-10 06:44:48.144750127 +0000 UTC m=+1235.239908323" lastFinishedPulling="2025-10-10 06:44:54.602933647 +0000 UTC m=+1241.698091863" observedRunningTime="2025-10-10 06:44:55.633178618 +0000 UTC m=+1242.728336814" watchObservedRunningTime="2025-10-10 06:44:55.635427953 +0000 UTC m=+1242.730586149" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.662329 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.662414 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.683669 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:55 crc kubenswrapper[4822]: E1010 06:44:55.684021 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-notification-agent" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684039 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-notification-agent" Oct 10 06:44:55 crc kubenswrapper[4822]: E1010 06:44:55.684057 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-central-agent" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684063 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-central-agent" Oct 10 06:44:55 crc 
kubenswrapper[4822]: E1010 06:44:55.684094 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="sg-core" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684102 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="sg-core" Oct 10 06:44:55 crc kubenswrapper[4822]: E1010 06:44:55.684112 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="proxy-httpd" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684117 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="proxy-httpd" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684277 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-central-agent" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684286 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="ceilometer-notification-agent" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684297 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="proxy-httpd" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.684310 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" containerName="sg-core" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.686016 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.690075 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.690367 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.696680 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.849617 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.849891 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkrwl\" (UniqueName: \"kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.849955 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.850240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " 
pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.850313 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.850399 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.850559 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952494 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952524 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952569 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952607 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952653 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkrwl\" (UniqueName: \"kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952677 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.952974 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc 
kubenswrapper[4822]: I1010 06:44:55.953600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.956666 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.956739 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.956791 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.966571 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data\") pod \"ceilometer-0\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:55 crc kubenswrapper[4822]: I1010 06:44:55.974512 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkrwl\" (UniqueName: \"kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl\") pod \"ceilometer-0\" (UID: 
\"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " pod="openstack/ceilometer-0" Oct 10 06:44:56 crc kubenswrapper[4822]: I1010 06:44:56.000318 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:44:56 crc kubenswrapper[4822]: I1010 06:44:56.443730 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:44:56 crc kubenswrapper[4822]: W1010 06:44:56.445726 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4aefaa1_e38b_48e7_a85e_fe895ccffff7.slice/crio-bf885a2da01ac86eecc4948c17c582c8881d8edc688c51a97c2d15272df09883 WatchSource:0}: Error finding container bf885a2da01ac86eecc4948c17c582c8881d8edc688c51a97c2d15272df09883: Status 404 returned error can't find the container with id bf885a2da01ac86eecc4948c17c582c8881d8edc688c51a97c2d15272df09883 Oct 10 06:44:56 crc kubenswrapper[4822]: I1010 06:44:56.618752 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerStarted","Data":"bf885a2da01ac86eecc4948c17c582c8881d8edc688c51a97c2d15272df09883"} Oct 10 06:44:57 crc kubenswrapper[4822]: I1010 06:44:57.628788 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerStarted","Data":"766e0a9fc31a94d94565d933cf60c64e8bd2d324aa1880600ca608654de47ac4"} Oct 10 06:44:57 crc kubenswrapper[4822]: I1010 06:44:57.667637 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b36ea1-cdc2-4db2-b425-8437aed45ec0" path="/var/lib/kubelet/pods/c6b36ea1-cdc2-4db2-b425-8437aed45ec0/volumes" Oct 10 06:44:59 crc kubenswrapper[4822]: I1010 06:44:59.661441 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerStarted","Data":"7cec3884c79d6eb069131be1249b2666cf250b5993c63941f771d7fa1f2eed0a"} Oct 10 06:44:59 crc kubenswrapper[4822]: I1010 06:44:59.662262 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerStarted","Data":"c01b8d1fde5f0dc00b61198ce9e71f17e6780216e373078e89162b1143c076ea"} Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.152332 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5"] Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.154112 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.156402 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.158581 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.161144 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5"] Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.225856 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.226116 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2rx\" (UniqueName: \"kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.226209 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.328152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2rx\" (UniqueName: \"kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.328230 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.328341 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.329721 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.332976 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.347858 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2rx\" (UniqueName: \"kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx\") pod \"collect-profiles-29334645-8pgt5\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.492613 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:00 crc kubenswrapper[4822]: I1010 06:45:00.940652 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5"] Oct 10 06:45:00 crc kubenswrapper[4822]: W1010 06:45:00.954133 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7af8772_83f7_4add_b635_3ceda2a642d7.slice/crio-1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400 WatchSource:0}: Error finding container 1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400: Status 404 returned error can't find the container with id 1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400 Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.696044 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerStarted","Data":"a88516a6cab2f92eaa4e0168dae7f3166fc8f1dca16b9387de094589b2128bd8"} Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.696185 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.699446 4822 generic.go:334] "Generic (PLEG): container finished" podID="c7af8772-83f7-4add-b635-3ceda2a642d7" containerID="2b0a4c86aa4a5d5f5e49b7088c533c1387da56f53e88bd7617c63d83ab2dcb9f" exitCode=0 Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.699501 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" event={"ID":"c7af8772-83f7-4add-b635-3ceda2a642d7","Type":"ContainerDied","Data":"2b0a4c86aa4a5d5f5e49b7088c533c1387da56f53e88bd7617c63d83ab2dcb9f"} Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.699532 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" event={"ID":"c7af8772-83f7-4add-b635-3ceda2a642d7","Type":"ContainerStarted","Data":"1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400"} Oct 10 06:45:01 crc kubenswrapper[4822]: I1010 06:45:01.720914 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.427255448 podStartE2EDuration="6.720897311s" podCreationTimestamp="2025-10-10 06:44:55 +0000 UTC" firstStartedPulling="2025-10-10 06:44:56.447875789 +0000 UTC m=+1243.543033985" lastFinishedPulling="2025-10-10 06:45:00.741517652 +0000 UTC m=+1247.836675848" observedRunningTime="2025-10-10 06:45:01.714703882 +0000 UTC m=+1248.809862078" watchObservedRunningTime="2025-10-10 06:45:01.720897311 +0000 UTC m=+1248.816055507" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.039402 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.212968 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q2rx\" (UniqueName: \"kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx\") pod \"c7af8772-83f7-4add-b635-3ceda2a642d7\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.213128 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume\") pod \"c7af8772-83f7-4add-b635-3ceda2a642d7\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.213186 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume\") pod \"c7af8772-83f7-4add-b635-3ceda2a642d7\" (UID: \"c7af8772-83f7-4add-b635-3ceda2a642d7\") " Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.215340 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7af8772-83f7-4add-b635-3ceda2a642d7" (UID: "c7af8772-83f7-4add-b635-3ceda2a642d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.228635 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx" (OuterVolumeSpecName: "kube-api-access-8q2rx") pod "c7af8772-83f7-4add-b635-3ceda2a642d7" (UID: "c7af8772-83f7-4add-b635-3ceda2a642d7"). InnerVolumeSpecName "kube-api-access-8q2rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.232051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7af8772-83f7-4add-b635-3ceda2a642d7" (UID: "c7af8772-83f7-4add-b635-3ceda2a642d7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.314894 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7af8772-83f7-4add-b635-3ceda2a642d7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.314942 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q2rx\" (UniqueName: \"kubernetes.io/projected/c7af8772-83f7-4add-b635-3ceda2a642d7-kube-api-access-8q2rx\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.314954 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7af8772-83f7-4add-b635-3ceda2a642d7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.720100 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" event={"ID":"c7af8772-83f7-4add-b635-3ceda2a642d7","Type":"ContainerDied","Data":"1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400"} Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.720366 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bcc4144ccb40d79c1048f187b48210bd3f832f5f788ab1fa13abbe3e0a8a400" Oct 10 06:45:03 crc kubenswrapper[4822]: I1010 06:45:03.720165 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5" Oct 10 06:45:10 crc kubenswrapper[4822]: I1010 06:45:10.800668 4822 generic.go:334] "Generic (PLEG): container finished" podID="37631965-5ba5-48e5-93a4-b5caa45ac6e5" containerID="c6d5c971df658fefb60ce2392b3efcf4416ced783ebf37b679584284d13e2d5f" exitCode=0 Oct 10 06:45:10 crc kubenswrapper[4822]: I1010 06:45:10.800780 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" event={"ID":"37631965-5ba5-48e5-93a4-b5caa45ac6e5","Type":"ContainerDied","Data":"c6d5c971df658fefb60ce2392b3efcf4416ced783ebf37b679584284d13e2d5f"} Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.177150 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.282758 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn58t\" (UniqueName: \"kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t\") pod \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.282922 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle\") pod \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.283012 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts\") pod \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.283699 
4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data\") pod \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\" (UID: \"37631965-5ba5-48e5-93a4-b5caa45ac6e5\") " Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.288503 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts" (OuterVolumeSpecName: "scripts") pod "37631965-5ba5-48e5-93a4-b5caa45ac6e5" (UID: "37631965-5ba5-48e5-93a4-b5caa45ac6e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.289851 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t" (OuterVolumeSpecName: "kube-api-access-nn58t") pod "37631965-5ba5-48e5-93a4-b5caa45ac6e5" (UID: "37631965-5ba5-48e5-93a4-b5caa45ac6e5"). InnerVolumeSpecName "kube-api-access-nn58t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.315725 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37631965-5ba5-48e5-93a4-b5caa45ac6e5" (UID: "37631965-5ba5-48e5-93a4-b5caa45ac6e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.335054 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data" (OuterVolumeSpecName: "config-data") pod "37631965-5ba5-48e5-93a4-b5caa45ac6e5" (UID: "37631965-5ba5-48e5-93a4-b5caa45ac6e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.385590 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn58t\" (UniqueName: \"kubernetes.io/projected/37631965-5ba5-48e5-93a4-b5caa45ac6e5-kube-api-access-nn58t\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.385623 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.385633 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.385642 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37631965-5ba5-48e5-93a4-b5caa45ac6e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.822528 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" event={"ID":"37631965-5ba5-48e5-93a4-b5caa45ac6e5","Type":"ContainerDied","Data":"d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21"} Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.822887 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9cacb6581fe8212a542876c9fd53e0934aa4a89f8bb1b04a5634b62e173cc21" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.822635 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jk5j7" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.922946 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:45:12 crc kubenswrapper[4822]: E1010 06:45:12.929582 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7af8772-83f7-4add-b635-3ceda2a642d7" containerName="collect-profiles" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.929644 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7af8772-83f7-4add-b635-3ceda2a642d7" containerName="collect-profiles" Oct 10 06:45:12 crc kubenswrapper[4822]: E1010 06:45:12.929671 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37631965-5ba5-48e5-93a4-b5caa45ac6e5" containerName="nova-cell0-conductor-db-sync" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.929678 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="37631965-5ba5-48e5-93a4-b5caa45ac6e5" containerName="nova-cell0-conductor-db-sync" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.931529 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7af8772-83f7-4add-b635-3ceda2a642d7" containerName="collect-profiles" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.931576 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="37631965-5ba5-48e5-93a4-b5caa45ac6e5" containerName="nova-cell0-conductor-db-sync" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.934066 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.938258 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pxldg" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.939674 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 06:45:12 crc kubenswrapper[4822]: I1010 06:45:12.969112 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.098946 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.098986 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.099057 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v4px\" (UniqueName: \"kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.200633 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.200681 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.200768 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v4px\" (UniqueName: \"kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.206625 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.212584 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.221824 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v4px\" (UniqueName: \"kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px\") pod \"nova-cell0-conductor-0\" (UID: 
\"09b24550-0f5f-46ff-bf11-192fa1f15650\") " pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.256593 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.702837 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:45:13 crc kubenswrapper[4822]: I1010 06:45:13.834851 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"09b24550-0f5f-46ff-bf11-192fa1f15650","Type":"ContainerStarted","Data":"5281d1b387e797afe610f4ac5538b51ce7ca78e96aa44b4af4fb2e1356932856"} Oct 10 06:45:14 crc kubenswrapper[4822]: I1010 06:45:14.850056 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"09b24550-0f5f-46ff-bf11-192fa1f15650","Type":"ContainerStarted","Data":"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca"} Oct 10 06:45:14 crc kubenswrapper[4822]: I1010 06:45:14.850547 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:14 crc kubenswrapper[4822]: I1010 06:45:14.875068 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.8750328659999997 podStartE2EDuration="2.875032866s" podCreationTimestamp="2025-10-10 06:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:14.873053799 +0000 UTC m=+1261.968212065" watchObservedRunningTime="2025-10-10 06:45:14.875032866 +0000 UTC m=+1261.970191152" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.292764 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 06:45:23 crc kubenswrapper[4822]: 
I1010 06:45:23.730743 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cshfh"] Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.731826 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.734233 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.734281 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.741563 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cshfh"] Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.913145 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h26p\" (UniqueName: \"kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.913241 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.913301 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") 
" pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.913403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.926625 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.928116 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.930722 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 06:45:23 crc kubenswrapper[4822]: I1010 06:45:23.943652 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.011726 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.013708 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.015195 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.015265 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h26p\" (UniqueName: \"kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.015305 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.015342 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.016419 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.021508 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.026021 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.028387 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.032865 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.047233 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.053205 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.066337 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h26p\" (UniqueName: \"kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p\") pod \"nova-cell0-cell-mapping-cshfh\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.067678 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.115076 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152329 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152377 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152549 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2wzs\" (UniqueName: \"kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152583 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152609 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152684 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbjk\" (UniqueName: \"kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.152708 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.192103 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.193831 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.217180 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.235903 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.237295 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.240937 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.261257 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.267761 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.269406 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.269526 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " 
pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.269643 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.272796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.273028 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.273189 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.273368 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgh9\" (UniqueName: \"kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.273472 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.276263 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.276413 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2wzs\" (UniqueName: \"kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.276541 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzqqw\" (UniqueName: \"kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.276701 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.276891 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.277094 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.277233 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbjk\" (UniqueName: \"kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.277337 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.277450 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjgx5\" (UniqueName: \"kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.277576 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.278337 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.278424 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.278480 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.279604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.280137 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc 
kubenswrapper[4822]: I1010 06:45:24.287401 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.300575 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbjk\" (UniqueName: \"kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk\") pod \"nova-api-0\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") " pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.305210 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2wzs\" (UniqueName: \"kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs\") pod \"nova-scheduler-0\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") " pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.350250 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.379992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjgx5\" (UniqueName: \"kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.380306 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.380434 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.380546 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.380675 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: 
I1010 06:45:24.380842 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.380961 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381089 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381190 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381309 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trgh9\" (UniqueName: 
\"kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381581 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381690 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381694 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzqqw\" (UniqueName: \"kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.381863 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.383078 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" 
(UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.384267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.384537 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.384638 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.384279 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.386678 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc 
kubenswrapper[4822]: I1010 06:45:24.386689 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.396120 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.400073 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgh9\" (UniqueName: \"kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.403530 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjgx5\" (UniqueName: \"kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5\") pod \"dnsmasq-dns-757b4f8459-9cmt4\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.405960 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzqqw\" (UniqueName: \"kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw\") pod \"nova-metadata-0\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.447716 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.470517 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.522017 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.554445 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.562622 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.812434 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p9vff"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.814035 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.816936 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.818997 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p9vff"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.826492 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.857824 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cshfh"] Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.892916 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rjz2\" (UniqueName: \"kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.892979 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.893052 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " 
pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.893102 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.972982 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cshfh" event={"ID":"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3","Type":"ContainerStarted","Data":"3caf603d172efc426382580b37a0e2d859af77d8954f6dd739e82d7b899b5c5f"} Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.994426 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.994499 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.994543 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rjz2\" (UniqueName: \"kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 
06:45:24 crc kubenswrapper[4822]: I1010 06:45:24.994579 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.000712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.018625 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.027718 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.054772 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rjz2\" (UniqueName: \"kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2\") pod \"nova-cell1-conductor-db-sync-p9vff\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") " pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 
06:45:25.057469 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:45:25 crc kubenswrapper[4822]: W1010 06:45:25.060692 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d06dcf6_eff3_4b8a_a9c2_12ac34696221.slice/crio-443266395f0f3afd0393b66b8e5aef910c81fe7b36148b6c329b8bb3d27dae54 WatchSource:0}: Error finding container 443266395f0f3afd0393b66b8e5aef910c81fe7b36148b6c329b8bb3d27dae54: Status 404 returned error can't find the container with id 443266395f0f3afd0393b66b8e5aef910c81fe7b36148b6c329b8bb3d27dae54 Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.075119 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.140220 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p9vff" Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.360466 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.369134 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.454468 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:25 crc kubenswrapper[4822]: W1010 06:45:25.455955 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfddbc361_4d41_4cab_9baf_aa1dfad98431.slice/crio-21f6228be49ef6a7badbd525ea1579ccd19b5d371d032867d61bf4dc9930696b WatchSource:0}: Error finding container 21f6228be49ef6a7badbd525ea1579ccd19b5d371d032867d61bf4dc9930696b: Status 404 returned error can't find the container with id 21f6228be49ef6a7badbd525ea1579ccd19b5d371d032867d61bf4dc9930696b 
Oct 10 06:45:25 crc kubenswrapper[4822]: W1010 06:45:25.626136 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc70e85_31c3_40fd_97ca_3522817405ad.slice/crio-1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79 WatchSource:0}: Error finding container 1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79: Status 404 returned error can't find the container with id 1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79 Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.626422 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p9vff"] Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.991629 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerStarted","Data":"a3dc61e02a5b14f3bcb604d54ac37e53d2f42e88dc1a5d071b846bb4226b9bd4"} Oct 10 06:45:25 crc kubenswrapper[4822]: I1010 06:45:25.993891 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cshfh" event={"ID":"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3","Type":"ContainerStarted","Data":"9c50f2c0fe0d406dc24edf889d7edf03b140f900ca3e39af887f67f157e22dca"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.010454 4822 generic.go:334] "Generic (PLEG): container finished" podID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerID="72fd1f10fe85a512180703a8a77bc5c119e8cb266dce3be3540c479f736477cb" exitCode=0 Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.010607 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" event={"ID":"04f0c0de-cff5-4a04-9b80-204c4791f430","Type":"ContainerDied","Data":"72fd1f10fe85a512180703a8a77bc5c119e8cb266dce3be3540c479f736477cb"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.010634 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" event={"ID":"04f0c0de-cff5-4a04-9b80-204c4791f430","Type":"ContainerStarted","Data":"a9533b6facc57803587c8a5df3107fb924a3f02438dee24f313c85d755c21758"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.010763 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.011455 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cshfh" podStartSLOduration=3.011437536 podStartE2EDuration="3.011437536s" podCreationTimestamp="2025-10-10 06:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:26.006490453 +0000 UTC m=+1273.101648659" watchObservedRunningTime="2025-10-10 06:45:26.011437536 +0000 UTC m=+1273.106595732" Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.028230 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerStarted","Data":"443266395f0f3afd0393b66b8e5aef910c81fe7b36148b6c329b8bb3d27dae54"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.046836 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p9vff" event={"ID":"4bc70e85-31c3-40fd-97ca-3522817405ad","Type":"ContainerStarted","Data":"baea442fa07cf4f678380e3207aef11e9c69e1b72f190fc82c22d6fed63d0641"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.046881 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p9vff" event={"ID":"4bc70e85-31c3-40fd-97ca-3522817405ad","Type":"ContainerStarted","Data":"1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.056774 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"536625e7-3444-4b3f-bd94-c9fb49e997fb","Type":"ContainerStarted","Data":"c1f7116a9c1b49aaae73d02c2fc4fdf8a53cd6774e1272f80a57b4855120384b"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.069304 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fddbc361-4d41-4cab-9baf-aa1dfad98431","Type":"ContainerStarted","Data":"21f6228be49ef6a7badbd525ea1579ccd19b5d371d032867d61bf4dc9930696b"} Oct 10 06:45:26 crc kubenswrapper[4822]: I1010 06:45:26.140665 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-p9vff" podStartSLOduration=2.140617797 podStartE2EDuration="2.140617797s" podCreationTimestamp="2025-10-10 06:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:26.089330942 +0000 UTC m=+1273.184489138" watchObservedRunningTime="2025-10-10 06:45:26.140617797 +0000 UTC m=+1273.235775993" Oct 10 06:45:27 crc kubenswrapper[4822]: I1010 06:45:27.103007 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" event={"ID":"04f0c0de-cff5-4a04-9b80-204c4791f430","Type":"ContainerStarted","Data":"428bf94e5a8032585c4fea0fa4e5da331ef5ad8700f506b032bf5e4e5b727f3d"} Oct 10 06:45:27 crc kubenswrapper[4822]: I1010 06:45:27.103462 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:27 crc kubenswrapper[4822]: I1010 06:45:27.125614 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" podStartSLOduration=3.125591118 podStartE2EDuration="3.125591118s" podCreationTimestamp="2025-10-10 06:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 06:45:27.122329123 +0000 UTC m=+1274.217487339" watchObservedRunningTime="2025-10-10 06:45:27.125591118 +0000 UTC m=+1274.220749314" Oct 10 06:45:27 crc kubenswrapper[4822]: I1010 06:45:27.712919 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:27 crc kubenswrapper[4822]: I1010 06:45:27.730839 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.124782 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"536625e7-3444-4b3f-bd94-c9fb49e997fb","Type":"ContainerStarted","Data":"b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a"} Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.130481 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerStarted","Data":"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"} Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.133522 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fddbc361-4d41-4cab-9baf-aa1dfad98431","Type":"ContainerStarted","Data":"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3"} Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.133992 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fddbc361-4d41-4cab-9baf-aa1dfad98431" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3" gracePeriod=30 Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.147656 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerStarted","Data":"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2"} Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.148962 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7888407219999998 podStartE2EDuration="6.148946735s" podCreationTimestamp="2025-10-10 06:45:23 +0000 UTC" firstStartedPulling="2025-10-10 06:45:25.383461363 +0000 UTC m=+1272.478619569" lastFinishedPulling="2025-10-10 06:45:28.743567386 +0000 UTC m=+1275.838725582" observedRunningTime="2025-10-10 06:45:29.143217699 +0000 UTC m=+1276.238375905" watchObservedRunningTime="2025-10-10 06:45:29.148946735 +0000 UTC m=+1276.244104931" Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.169377 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.87478204 podStartE2EDuration="5.169361696s" podCreationTimestamp="2025-10-10 06:45:24 +0000 UTC" firstStartedPulling="2025-10-10 06:45:25.458089304 +0000 UTC m=+1272.553247500" lastFinishedPulling="2025-10-10 06:45:28.75266896 +0000 UTC m=+1275.847827156" observedRunningTime="2025-10-10 06:45:29.1636226 +0000 UTC m=+1276.258780796" watchObservedRunningTime="2025-10-10 06:45:29.169361696 +0000 UTC m=+1276.264519882" Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.555558 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 06:45:29 crc kubenswrapper[4822]: I1010 06:45:29.563794 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.155832 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerStarted","Data":"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16"} Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.155958 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-log" containerID="cri-o://4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" gracePeriod=30 Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.156060 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-metadata" containerID="cri-o://60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" gracePeriod=30 Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.162986 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerStarted","Data":"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"} Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.177044 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.5122524779999997 podStartE2EDuration="7.177020103s" podCreationTimestamp="2025-10-10 06:45:23 +0000 UTC" firstStartedPulling="2025-10-10 06:45:25.079636676 +0000 UTC m=+1272.174794872" lastFinishedPulling="2025-10-10 06:45:28.744404301 +0000 UTC m=+1275.839562497" observedRunningTime="2025-10-10 06:45:30.176478238 +0000 UTC m=+1277.271636444" watchObservedRunningTime="2025-10-10 06:45:30.177020103 +0000 UTC m=+1277.272178309" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.202776 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.521422835 
podStartE2EDuration="7.202751138s" podCreationTimestamp="2025-10-10 06:45:23 +0000 UTC" firstStartedPulling="2025-10-10 06:45:25.063032736 +0000 UTC m=+1272.158190942" lastFinishedPulling="2025-10-10 06:45:28.744361049 +0000 UTC m=+1275.839519245" observedRunningTime="2025-10-10 06:45:30.194636293 +0000 UTC m=+1277.289794499" watchObservedRunningTime="2025-10-10 06:45:30.202751138 +0000 UTC m=+1277.297909334" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.373851 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.374103 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ca7c7319-b717-4031-b211-5dfa1c501003" containerName="kube-state-metrics" containerID="cri-o://81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b" gracePeriod=30 Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.712095 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.847098 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzqqw\" (UniqueName: \"kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw\") pod \"980081a1-a752-4d92-94ac-1e789b37e8ec\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.847298 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data\") pod \"980081a1-a752-4d92-94ac-1e789b37e8ec\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.847340 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs\") pod \"980081a1-a752-4d92-94ac-1e789b37e8ec\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.847380 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle\") pod \"980081a1-a752-4d92-94ac-1e789b37e8ec\" (UID: \"980081a1-a752-4d92-94ac-1e789b37e8ec\") " Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.849485 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs" (OuterVolumeSpecName: "logs") pod "980081a1-a752-4d92-94ac-1e789b37e8ec" (UID: "980081a1-a752-4d92-94ac-1e789b37e8ec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.856103 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw" (OuterVolumeSpecName: "kube-api-access-gzqqw") pod "980081a1-a752-4d92-94ac-1e789b37e8ec" (UID: "980081a1-a752-4d92-94ac-1e789b37e8ec"). InnerVolumeSpecName "kube-api-access-gzqqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.890183 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data" (OuterVolumeSpecName: "config-data") pod "980081a1-a752-4d92-94ac-1e789b37e8ec" (UID: "980081a1-a752-4d92-94ac-1e789b37e8ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.892960 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "980081a1-a752-4d92-94ac-1e789b37e8ec" (UID: "980081a1-a752-4d92-94ac-1e789b37e8ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.950128 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzqqw\" (UniqueName: \"kubernetes.io/projected/980081a1-a752-4d92-94ac-1e789b37e8ec-kube-api-access-gzqqw\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.950166 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.950176 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/980081a1-a752-4d92-94ac-1e789b37e8ec-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.950185 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/980081a1-a752-4d92-94ac-1e789b37e8ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:30 crc kubenswrapper[4822]: I1010 06:45:30.973143 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.051109 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm5tn\" (UniqueName: \"kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn\") pod \"ca7c7319-b717-4031-b211-5dfa1c501003\" (UID: \"ca7c7319-b717-4031-b211-5dfa1c501003\") " Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.064990 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn" (OuterVolumeSpecName: "kube-api-access-wm5tn") pod "ca7c7319-b717-4031-b211-5dfa1c501003" (UID: "ca7c7319-b717-4031-b211-5dfa1c501003"). InnerVolumeSpecName "kube-api-access-wm5tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.153730 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm5tn\" (UniqueName: \"kubernetes.io/projected/ca7c7319-b717-4031-b211-5dfa1c501003-kube-api-access-wm5tn\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.172508 4822 generic.go:334] "Generic (PLEG): container finished" podID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerID="60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" exitCode=0 Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.172537 4822 generic.go:334] "Generic (PLEG): container finished" podID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerID="4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" exitCode=143 Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.172557 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.172550 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerDied","Data":"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16"} Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.173177 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerDied","Data":"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2"} Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.173196 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"980081a1-a752-4d92-94ac-1e789b37e8ec","Type":"ContainerDied","Data":"a3dc61e02a5b14f3bcb604d54ac37e53d2f42e88dc1a5d071b846bb4226b9bd4"} Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.173216 4822 scope.go:117] "RemoveContainer" containerID="60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.174939 4822 generic.go:334] "Generic (PLEG): container finished" podID="ca7c7319-b717-4031-b211-5dfa1c501003" containerID="81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b" exitCode=2 Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.175066 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca7c7319-b717-4031-b211-5dfa1c501003","Type":"ContainerDied","Data":"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b"} Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.176142 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ca7c7319-b717-4031-b211-5dfa1c501003","Type":"ContainerDied","Data":"7222b4422ec7269c0efea74e8ff34b68cd70102eb95b90cededb1242f90f5f6c"} 
Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.175129 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.221785 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.228124 4822 scope.go:117] "RemoveContainer" containerID="4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.238442 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.261180 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.283722 4822 scope.go:117] "RemoveContainer" containerID="60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.284422 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16\": container with ID starting with 60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16 not found: ID does not exist" containerID="60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.284458 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16"} err="failed to get container status \"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16\": rpc error: code = NotFound desc = could not find container \"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16\": container with ID starting with 
60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16 not found: ID does not exist" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.284488 4822 scope.go:117] "RemoveContainer" containerID="4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.285379 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2\": container with ID starting with 4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2 not found: ID does not exist" containerID="4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.285404 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2"} err="failed to get container status \"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2\": rpc error: code = NotFound desc = could not find container \"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2\": container with ID starting with 4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2 not found: ID does not exist" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.285423 4822 scope.go:117] "RemoveContainer" containerID="60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.285784 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16"} err="failed to get container status \"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16\": rpc error: code = NotFound desc = could not find container \"60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16\": container with ID 
starting with 60350c62abfe19159e61078880343550da70b154c675484285c6fd7ce72a8b16 not found: ID does not exist" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.285824 4822 scope.go:117] "RemoveContainer" containerID="4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.286173 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2"} err="failed to get container status \"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2\": rpc error: code = NotFound desc = could not find container \"4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2\": container with ID starting with 4282eaa7d220d6adedf0d96478f843620f335b78036a25d5dafe577f00f60ff2 not found: ID does not exist" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.286197 4822 scope.go:117] "RemoveContainer" containerID="81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.292894 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.308370 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.308852 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-metadata" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.308869 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-metadata" Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.308891 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-log" Oct 10 06:45:31 
crc kubenswrapper[4822]: I1010 06:45:31.308899 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-log" Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.308925 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7c7319-b717-4031-b211-5dfa1c501003" containerName="kube-state-metrics" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.308934 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7c7319-b717-4031-b211-5dfa1c501003" containerName="kube-state-metrics" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.309169 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7c7319-b717-4031-b211-5dfa1c501003" containerName="kube-state-metrics" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.309210 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-metadata" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.309232 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" containerName="nova-metadata-log" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.310457 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.316776 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.322076 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.323464 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.323649 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.332922 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.333212 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.333218 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.356204 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.357154 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.357266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdvx\" (UniqueName: \"kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.357306 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.357325 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.357362 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.386124 4822 scope.go:117] "RemoveContainer" containerID="81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b" Oct 10 06:45:31 crc kubenswrapper[4822]: E1010 06:45:31.390780 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b\": container with ID starting with 81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b not found: ID does not exist" containerID="81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.390968 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b"} err="failed to get container status \"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b\": rpc error: code = NotFound desc = could not find container \"81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b\": container with ID starting with 81f23848514368e71c22e71056481cc3fb09f175d948bd9c0a01fc10b49b0b1b not found: ID does not exist" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.458845 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7qdvx\" (UniqueName: \"kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.459260 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.459645 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.460456 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.460666 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.460876 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.461413 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.461548 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqcr\" (UniqueName: \"kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.461703 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.461898 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.464217 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.464993 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.466555 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.474519 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdvx\" (UniqueName: \"kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx\") pod \"nova-metadata-0\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.563478 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.563594 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqcr\" (UniqueName: \"kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr\") pod \"kube-state-metrics-0\" (UID: 
\"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.563620 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.563654 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.567573 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.568358 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.569424 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " 
pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.579298 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqcr\" (UniqueName: \"kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr\") pod \"kube-state-metrics-0\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " pod="openstack/kube-state-metrics-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.656246 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.664560 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980081a1-a752-4d92-94ac-1e789b37e8ec" path="/var/lib/kubelet/pods/980081a1-a752-4d92-94ac-1e789b37e8ec/volumes" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.665135 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7c7319-b717-4031-b211-5dfa1c501003" path="/var/lib/kubelet/pods/ca7c7319-b717-4031-b211-5dfa1c501003/volumes" Oct 10 06:45:31 crc kubenswrapper[4822]: I1010 06:45:31.686318 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.184989 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:32 crc kubenswrapper[4822]: W1010 06:45:32.249649 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb8479c_8058_4a41_9bc7_8fd09bd321d8.slice/crio-6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a WatchSource:0}: Error finding container 6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a: Status 404 returned error can't find the container with id 6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.249729 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.517250 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.517888 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-central-agent" containerID="cri-o://766e0a9fc31a94d94565d933cf60c64e8bd2d324aa1880600ca608654de47ac4" gracePeriod=30 Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.518016 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="sg-core" containerID="cri-o://7cec3884c79d6eb069131be1249b2666cf250b5993c63941f771d7fa1f2eed0a" gracePeriod=30 Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.518194 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="proxy-httpd" 
containerID="cri-o://a88516a6cab2f92eaa4e0168dae7f3166fc8f1dca16b9387de094589b2128bd8" gracePeriod=30 Oct 10 06:45:32 crc kubenswrapper[4822]: I1010 06:45:32.518033 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-notification-agent" containerID="cri-o://c01b8d1fde5f0dc00b61198ce9e71f17e6780216e373078e89162b1143c076ea" gracePeriod=30 Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211286 4822 generic.go:334] "Generic (PLEG): container finished" podID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerID="a88516a6cab2f92eaa4e0168dae7f3166fc8f1dca16b9387de094589b2128bd8" exitCode=0 Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211693 4822 generic.go:334] "Generic (PLEG): container finished" podID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerID="7cec3884c79d6eb069131be1249b2666cf250b5993c63941f771d7fa1f2eed0a" exitCode=2 Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211708 4822 generic.go:334] "Generic (PLEG): container finished" podID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerID="766e0a9fc31a94d94565d933cf60c64e8bd2d324aa1880600ca608654de47ac4" exitCode=0 Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211361 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerDied","Data":"a88516a6cab2f92eaa4e0168dae7f3166fc8f1dca16b9387de094589b2128bd8"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211788 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerDied","Data":"7cec3884c79d6eb069131be1249b2666cf250b5993c63941f771d7fa1f2eed0a"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.211826 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerDied","Data":"766e0a9fc31a94d94565d933cf60c64e8bd2d324aa1880600ca608654de47ac4"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.214612 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"afb8479c-8058-4a41-9bc7-8fd09bd321d8","Type":"ContainerStarted","Data":"c7baa3dcfc861f3f0c1cd8711d2244d5d331b16f63e69a27aa93f37eaaee5a7d"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.214662 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"afb8479c-8058-4a41-9bc7-8fd09bd321d8","Type":"ContainerStarted","Data":"6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.214730 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.217919 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerStarted","Data":"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.218042 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerStarted","Data":"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.218137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerStarted","Data":"0790b2fa95a554b73c34cbb7914d4c7363a5fd8f198e606d5701d49ba5127efe"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.219427 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" containerID="9c50f2c0fe0d406dc24edf889d7edf03b140f900ca3e39af887f67f157e22dca" exitCode=0 Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.219546 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cshfh" event={"ID":"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3","Type":"ContainerDied","Data":"9c50f2c0fe0d406dc24edf889d7edf03b140f900ca3e39af887f67f157e22dca"} Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.229849 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.637583351 podStartE2EDuration="2.229829179s" podCreationTimestamp="2025-10-10 06:45:31 +0000 UTC" firstStartedPulling="2025-10-10 06:45:32.254612762 +0000 UTC m=+1279.349770958" lastFinishedPulling="2025-10-10 06:45:32.84685859 +0000 UTC m=+1279.942016786" observedRunningTime="2025-10-10 06:45:33.227343967 +0000 UTC m=+1280.322502163" watchObservedRunningTime="2025-10-10 06:45:33.229829179 +0000 UTC m=+1280.324987375" Oct 10 06:45:33 crc kubenswrapper[4822]: I1010 06:45:33.260130 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.260103755 podStartE2EDuration="2.260103755s" podCreationTimestamp="2025-10-10 06:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:33.257001066 +0000 UTC m=+1280.352159282" watchObservedRunningTime="2025-10-10 06:45:33.260103755 +0000 UTC m=+1280.355261961" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.449285 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.449653 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 06:45:34 crc kubenswrapper[4822]: 
I1010 06:45:34.524949 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.559905 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.629119 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.629754 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="dnsmasq-dns" containerID="cri-o://d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b" gracePeriod=10 Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.635451 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.636530 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.730292 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data\") pod \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.730390 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle\") pod \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.730464 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h26p\" (UniqueName: \"kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p\") pod \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.730582 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts\") pod \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\" (UID: \"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3\") " Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.743743 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts" (OuterVolumeSpecName: "scripts") pod "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" (UID: "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.746320 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p" (OuterVolumeSpecName: "kube-api-access-5h26p") pod "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" (UID: "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3"). InnerVolumeSpecName "kube-api-access-5h26p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.769091 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data" (OuterVolumeSpecName: "config-data") pod "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" (UID: "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.772439 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" (UID: "2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.833068 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.833120 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.833136 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:34 crc kubenswrapper[4822]: I1010 06:45:34.833152 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h26p\" (UniqueName: \"kubernetes.io/projected/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3-kube-api-access-5h26p\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.176492 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242184 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242251 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242361 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242443 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqmwb\" (UniqueName: \"kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242476 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.242502 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb\") pod \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\" (UID: \"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.246053 4822 generic.go:334] "Generic (PLEG): container finished" podID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerID="c01b8d1fde5f0dc00b61198ce9e71f17e6780216e373078e89162b1143c076ea" exitCode=0 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.246140 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerDied","Data":"c01b8d1fde5f0dc00b61198ce9e71f17e6780216e373078e89162b1143c076ea"} Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.248405 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cshfh" event={"ID":"2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3","Type":"ContainerDied","Data":"3caf603d172efc426382580b37a0e2d859af77d8954f6dd739e82d7b899b5c5f"} Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.248442 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3caf603d172efc426382580b37a0e2d859af77d8954f6dd739e82d7b899b5c5f" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.248501 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cshfh" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.251454 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb" (OuterVolumeSpecName: "kube-api-access-rqmwb") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "kube-api-access-rqmwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.253846 4822 generic.go:334] "Generic (PLEG): container finished" podID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerID="d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b" exitCode=0 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.255537 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.256334 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" event={"ID":"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5","Type":"ContainerDied","Data":"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b"} Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.256378 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9lj64" event={"ID":"3f3c317f-a6bf-4d0b-8825-00eaa8a878f5","Type":"ContainerDied","Data":"a60bf2a6a25865761ebf409822e2df24282448121ae7ffeabe9b4e4bef3943e3"} Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.256398 4822 scope.go:117] "RemoveContainer" containerID="d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.298323 4822 scope.go:117] "RemoveContainer" containerID="8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.299823 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.301047 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.318704 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.328221 4822 scope.go:117] "RemoveContainer" containerID="d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.328585 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: E1010 06:45:35.329020 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b\": container with ID starting with d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b not found: ID does not exist" containerID="d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.329057 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b"} err="failed to get container status \"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b\": rpc error: code = NotFound desc = could not find container \"d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b\": container with ID starting with d2a6e310a1951c6d3871172e772a8679141561de61e8f51bbb9516cee6b82d5b not found: ID does not exist" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.329087 4822 scope.go:117] "RemoveContainer" containerID="8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28" Oct 10 06:45:35 crc kubenswrapper[4822]: E1010 06:45:35.329343 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28\": container with ID starting with 8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28 not found: ID does not exist" containerID="8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.329376 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28"} 
err="failed to get container status \"8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28\": rpc error: code = NotFound desc = could not find container \"8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28\": container with ID starting with 8f40ad85e5fab6441dd87a5daa90f19d2577e7d771596fb098d68b46c2c97d28 not found: ID does not exist" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.338484 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config" (OuterVolumeSpecName: "config") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.345024 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.345070 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.345082 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.345096 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqmwb\" (UniqueName: \"kubernetes.io/projected/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-kube-api-access-rqmwb\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.345111 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.352300 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" (UID: "3f3c317f-a6bf-4d0b-8825-00eaa8a878f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.386416 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.386859 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-log" containerID="cri-o://a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c" gracePeriod=30 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.387273 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-api" containerID="cri-o://4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774" gracePeriod=30 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.404447 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": EOF" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.409500 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": EOF" Oct 10 06:45:35 
crc kubenswrapper[4822]: I1010 06:45:35.439331 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.439829 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-log" containerID="cri-o://301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" gracePeriod=30 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.439879 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-metadata" containerID="cri-o://327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" gracePeriod=30 Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.446714 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.649763 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.686017 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9lj64"] Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.838710 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.927487 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.955108 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.956914 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.957080 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.957189 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.957724 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkrwl\" (UniqueName: \"kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc 
kubenswrapper[4822]: I1010 06:45:35.957920 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.958083 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts\") pod \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\" (UID: \"e4aefaa1-e38b-48e7-a85e-fe895ccffff7\") " Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.956304 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.960648 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.960908 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.964027 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl" (OuterVolumeSpecName: "kube-api-access-hkrwl") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "kube-api-access-hkrwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:35 crc kubenswrapper[4822]: I1010 06:45:35.965234 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts" (OuterVolumeSpecName: "scripts") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.006986 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.052135 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.059586 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle\") pod \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.060258 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs\") pod \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.060447 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs\") pod \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.060606 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qdvx\" (UniqueName: \"kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx\") pod \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.061591 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs" (OuterVolumeSpecName: "logs") pod "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" (UID: "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.061753 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data\") pod \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\" (UID: \"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5\") " Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.062693 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.062951 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.063051 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.063127 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.063196 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkrwl\" (UniqueName: \"kubernetes.io/projected/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-kube-api-access-hkrwl\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.063275 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 
06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.063516 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.066337 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx" (OuterVolumeSpecName: "kube-api-access-7qdvx") pod "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" (UID: "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5"). InnerVolumeSpecName "kube-api-access-7qdvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.092637 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data" (OuterVolumeSpecName: "config-data") pod "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" (UID: "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.097094 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" (UID: "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.110328 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data" (OuterVolumeSpecName: "config-data") pod "e4aefaa1-e38b-48e7-a85e-fe895ccffff7" (UID: "e4aefaa1-e38b-48e7-a85e-fe895ccffff7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.122152 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" (UID: "ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.167665 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.167717 4822 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.167727 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qdvx\" (UniqueName: \"kubernetes.io/projected/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-kube-api-access-7qdvx\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.167736 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.167745 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4aefaa1-e38b-48e7-a85e-fe895ccffff7-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.266998 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e4aefaa1-e38b-48e7-a85e-fe895ccffff7","Type":"ContainerDied","Data":"bf885a2da01ac86eecc4948c17c582c8881d8edc688c51a97c2d15272df09883"} Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.267057 4822 scope.go:117] "RemoveContainer" containerID="a88516a6cab2f92eaa4e0168dae7f3166fc8f1dca16b9387de094589b2128bd8" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.267198 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275084 4822 generic.go:334] "Generic (PLEG): container finished" podID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerID="327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" exitCode=0 Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275132 4822 generic.go:334] "Generic (PLEG): container finished" podID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerID="301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" exitCode=143 Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275201 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerDied","Data":"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f"} Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275210 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerDied","Data":"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2"} Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.275253 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5","Type":"ContainerDied","Data":"0790b2fa95a554b73c34cbb7914d4c7363a5fd8f198e606d5701d49ba5127efe"} Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.284872 4822 generic.go:334] "Generic (PLEG): container finished" podID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerID="a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c" exitCode=143 Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.284961 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerDied","Data":"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"} Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.302490 4822 scope.go:117] "RemoveContainer" containerID="7cec3884c79d6eb069131be1249b2666cf250b5993c63941f771d7fa1f2eed0a" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.314022 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.342033 4822 scope.go:117] "RemoveContainer" containerID="c01b8d1fde5f0dc00b61198ce9e71f17e6780216e373078e89162b1143c076ea" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.342363 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.354671 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355224 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-notification-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355239 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-notification-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355249 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-central-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355257 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-central-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355269 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="init" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355277 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="init" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355292 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="dnsmasq-dns" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355299 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="dnsmasq-dns" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355310 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="proxy-httpd" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355316 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" 
containerName="proxy-httpd" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355330 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="sg-core" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355338 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="sg-core" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355353 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-metadata" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355362 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-metadata" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355378 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" containerName="nova-manage" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355384 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" containerName="nova-manage" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.355392 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-log" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355398 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-log" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355591 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="sg-core" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355607 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" 
containerName="ceilometer-notification-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355621 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="ceilometer-central-agent" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355630 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" containerName="dnsmasq-dns" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355639 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" containerName="proxy-httpd" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355648 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-log" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355661 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" containerName="nova-metadata-metadata" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.355668 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" containerName="nova-manage" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.357515 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.366522 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.369973 4822 scope.go:117] "RemoveContainer" containerID="766e0a9fc31a94d94565d933cf60c64e8bd2d324aa1880600ca608654de47ac4" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.376171 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.399503 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.399873 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.423340 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.425507 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.461328 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.462972 4822 scope.go:117] "RemoveContainer" containerID="327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.463883 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.466743 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.467003 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.477943 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qqf\" (UniqueName: \"kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.477997 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478067 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478090 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478157 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478189 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478215 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.478246 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.485491 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.498929 4822 scope.go:117] "RemoveContainer" containerID="301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.580481 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.580793 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qqf\" (UniqueName: \"kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.580954 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.581069 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.581209 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.581321 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dfj\" (UniqueName: \"kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj\") pod \"nova-metadata-0\" (UID: 
\"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.581461 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.581557 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.582079 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.582175 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.582262 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.582343 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.582442 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.583988 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.584159 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.587513 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.587552 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.587671 
4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.597908 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.599032 4822 scope.go:117] "RemoveContainer" containerID="327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.599618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.599693 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f\": container with ID starting with 327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f not found: ID does not exist" containerID="327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.599737 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f"} err="failed to get container status \"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f\": rpc error: code = NotFound desc = 
could not find container \"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f\": container with ID starting with 327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f not found: ID does not exist" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.599773 4822 scope.go:117] "RemoveContainer" containerID="301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" Oct 10 06:45:36 crc kubenswrapper[4822]: E1010 06:45:36.600114 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2\": container with ID starting with 301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2 not found: ID does not exist" containerID="301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.600153 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2"} err="failed to get container status \"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2\": rpc error: code = NotFound desc = could not find container \"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2\": container with ID starting with 301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2 not found: ID does not exist" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.600182 4822 scope.go:117] "RemoveContainer" containerID="327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.600451 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f"} err="failed to get container status \"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f\": rpc error: code = 
NotFound desc = could not find container \"327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f\": container with ID starting with 327609c5f099728c8efe3d11fe359a18d614d701bc0b7208c71a4023d75bf61f not found: ID does not exist" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.600477 4822 scope.go:117] "RemoveContainer" containerID="301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.600708 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2"} err="failed to get container status \"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2\": rpc error: code = NotFound desc = could not find container \"301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2\": container with ID starting with 301ef4918c31b140b9c62d5308625b32dbd2403f45e6067cd5e282f35b5989b2 not found: ID does not exist" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.603421 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qqf\" (UniqueName: \"kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf\") pod \"ceilometer-0\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.685073 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.686517 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data\") 
pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.686711 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.687342 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92dfj\" (UniqueName: \"kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.687610 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.688079 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.690283 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.690816 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.690945 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.707106 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dfj\" (UniqueName: \"kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj\") pod \"nova-metadata-0\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " pod="openstack/nova-metadata-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.747894 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:45:36 crc kubenswrapper[4822]: I1010 06:45:36.804060 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.215143 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.304045 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerName="nova-scheduler-scheduler" containerID="cri-o://b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a" gracePeriod=30 Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.304400 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerStarted","Data":"e596c8f429f98c6fcb09b8eb4c72f72ab798200f99a951d5fbed9b47108db75d"} Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.330759 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:45:37 crc kubenswrapper[4822]: W1010 06:45:37.345519 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4e7a6a_d6b8_4c0d_ab35_4d0396a33c79.slice/crio-c6f5be1d7e20090d364ec469b79bfb138fc170695c6c1a67f6223ab0f4fc7abc WatchSource:0}: Error finding container c6f5be1d7e20090d364ec469b79bfb138fc170695c6c1a67f6223ab0f4fc7abc: Status 404 returned error can't find the container with id c6f5be1d7e20090d364ec469b79bfb138fc170695c6c1a67f6223ab0f4fc7abc Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.660757 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3c317f-a6bf-4d0b-8825-00eaa8a878f5" path="/var/lib/kubelet/pods/3f3c317f-a6bf-4d0b-8825-00eaa8a878f5/volumes" Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.662025 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4aefaa1-e38b-48e7-a85e-fe895ccffff7" 
path="/var/lib/kubelet/pods/e4aefaa1-e38b-48e7-a85e-fe895ccffff7/volumes" Oct 10 06:45:37 crc kubenswrapper[4822]: I1010 06:45:37.662904 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5" path="/var/lib/kubelet/pods/ec8fd7f1-77dd-4e07-bda7-ecb0933d01f5/volumes" Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.314749 4822 generic.go:334] "Generic (PLEG): container finished" podID="4bc70e85-31c3-40fd-97ca-3522817405ad" containerID="baea442fa07cf4f678380e3207aef11e9c69e1b72f190fc82c22d6fed63d0641" exitCode=0 Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.314847 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p9vff" event={"ID":"4bc70e85-31c3-40fd-97ca-3522817405ad","Type":"ContainerDied","Data":"baea442fa07cf4f678380e3207aef11e9c69e1b72f190fc82c22d6fed63d0641"} Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.318615 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerStarted","Data":"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4"} Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.318955 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerStarted","Data":"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743"} Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.318981 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerStarted","Data":"c6f5be1d7e20090d364ec469b79bfb138fc170695c6c1a67f6223ab0f4fc7abc"} Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.320722 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerStarted","Data":"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929"} Oct 10 06:45:38 crc kubenswrapper[4822]: I1010 06:45:38.353817 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.353783136 podStartE2EDuration="2.353783136s" podCreationTimestamp="2025-10-10 06:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:38.352988973 +0000 UTC m=+1285.448147179" watchObservedRunningTime="2025-10-10 06:45:38.353783136 +0000 UTC m=+1285.448941332" Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.334864 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerStarted","Data":"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49"} Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.335230 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerStarted","Data":"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706"} Oct 10 06:45:39 crc kubenswrapper[4822]: E1010 06:45:39.557683 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 06:45:39 crc kubenswrapper[4822]: E1010 06:45:39.560647 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 10 06:45:39 crc kubenswrapper[4822]: E1010 06:45:39.562304 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 10 06:45:39 crc kubenswrapper[4822]: E1010 06:45:39.562355 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerName="nova-scheduler-scheduler"
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.704821 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p9vff"
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.739622 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts\") pod \"4bc70e85-31c3-40fd-97ca-3522817405ad\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") "
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.739876 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rjz2\" (UniqueName: \"kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2\") pod \"4bc70e85-31c3-40fd-97ca-3522817405ad\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") "
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.740008 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle\") pod \"4bc70e85-31c3-40fd-97ca-3522817405ad\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") "
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.740060 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data\") pod \"4bc70e85-31c3-40fd-97ca-3522817405ad\" (UID: \"4bc70e85-31c3-40fd-97ca-3522817405ad\") "
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.747366 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts" (OuterVolumeSpecName: "scripts") pod "4bc70e85-31c3-40fd-97ca-3522817405ad" (UID: "4bc70e85-31c3-40fd-97ca-3522817405ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.747964 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2" (OuterVolumeSpecName: "kube-api-access-2rjz2") pod "4bc70e85-31c3-40fd-97ca-3522817405ad" (UID: "4bc70e85-31c3-40fd-97ca-3522817405ad"). InnerVolumeSpecName "kube-api-access-2rjz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.777102 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bc70e85-31c3-40fd-97ca-3522817405ad" (UID: "4bc70e85-31c3-40fd-97ca-3522817405ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.786956 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data" (OuterVolumeSpecName: "config-data") pod "4bc70e85-31c3-40fd-97ca-3522817405ad" (UID: "4bc70e85-31c3-40fd-97ca-3522817405ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.842542 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rjz2\" (UniqueName: \"kubernetes.io/projected/4bc70e85-31c3-40fd-97ca-3522817405ad-kube-api-access-2rjz2\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.842585 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.842596 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:39 crc kubenswrapper[4822]: I1010 06:45:39.842605 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc70e85-31c3-40fd-97ca-3522817405ad-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.343765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p9vff" event={"ID":"4bc70e85-31c3-40fd-97ca-3522817405ad","Type":"ContainerDied","Data":"1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79"}
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.343819 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e3808741512809f0029096ce185f320be1480625002c106af73dd00d1010f79"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.343867 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p9vff"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.404872 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 06:45:40 crc kubenswrapper[4822]: E1010 06:45:40.405283 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc70e85-31c3-40fd-97ca-3522817405ad" containerName="nova-cell1-conductor-db-sync"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.405300 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc70e85-31c3-40fd-97ca-3522817405ad" containerName="nova-cell1-conductor-db-sync"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.405496 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc70e85-31c3-40fd-97ca-3522817405ad" containerName="nova-cell1-conductor-db-sync"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.406117 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.408899 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.414890 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.454912 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.454959 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt7nc\" (UniqueName: \"kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.454977 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.556686 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.556726 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7nc\" (UniqueName: \"kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.556756 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.561496 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.564332 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.575270 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7nc\" (UniqueName: \"kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc\") pod \"nova-cell1-conductor-0\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:40 crc kubenswrapper[4822]: I1010 06:45:40.733837 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.208629 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.209268 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.275590 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data\") pod \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.276016 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle\") pod \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.276113 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfbjk\" (UniqueName: \"kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk\") pod \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.276154 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs\") pod \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\" (UID: \"9d06dcf6-eff3-4b8a-a9c2-12ac34696221\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.277094 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs" (OuterVolumeSpecName: "logs") pod "9d06dcf6-eff3-4b8a-a9c2-12ac34696221" (UID: "9d06dcf6-eff3-4b8a-a9c2-12ac34696221"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.284392 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk" (OuterVolumeSpecName: "kube-api-access-qfbjk") pod "9d06dcf6-eff3-4b8a-a9c2-12ac34696221" (UID: "9d06dcf6-eff3-4b8a-a9c2-12ac34696221"). InnerVolumeSpecName "kube-api-access-qfbjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.309230 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data" (OuterVolumeSpecName: "config-data") pod "9d06dcf6-eff3-4b8a-a9c2-12ac34696221" (UID: "9d06dcf6-eff3-4b8a-a9c2-12ac34696221"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.319545 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d06dcf6-eff3-4b8a-a9c2-12ac34696221" (UID: "9d06dcf6-eff3-4b8a-a9c2-12ac34696221"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.355501 4822 generic.go:334] "Generic (PLEG): container finished" podID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerID="4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774" exitCode=0
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.355581 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerDied","Data":"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"}
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.355606 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d06dcf6-eff3-4b8a-a9c2-12ac34696221","Type":"ContainerDied","Data":"443266395f0f3afd0393b66b8e5aef910c81fe7b36148b6c329b8bb3d27dae54"}
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.355621 4822 scope.go:117] "RemoveContainer" containerID="4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.355724 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.360352 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7ab4fbc-298d-4250-bf01-a73155f35532","Type":"ContainerStarted","Data":"2be616b0d8528d462efbcbdc4480ebb3fbbfa728eac5db6dd39a36ed9132cbc1"}
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.366456 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerStarted","Data":"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f"}
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.366910 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.370214 4822 generic.go:334] "Generic (PLEG): container finished" podID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerID="b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a" exitCode=0
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.370259 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"536625e7-3444-4b3f-bd94-c9fb49e997fb","Type":"ContainerDied","Data":"b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a"}
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.378335 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.378365 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.378380 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfbjk\" (UniqueName: \"kubernetes.io/projected/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-kube-api-access-qfbjk\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.378392 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d06dcf6-eff3-4b8a-a9c2-12ac34696221-logs\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.386405 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.143321441 podStartE2EDuration="5.386386016s" podCreationTimestamp="2025-10-10 06:45:36 +0000 UTC" firstStartedPulling="2025-10-10 06:45:37.214981201 +0000 UTC m=+1284.310139397" lastFinishedPulling="2025-10-10 06:45:40.458045776 +0000 UTC m=+1287.553203972" observedRunningTime="2025-10-10 06:45:41.386120999 +0000 UTC m=+1288.481279215" watchObservedRunningTime="2025-10-10 06:45:41.386386016 +0000 UTC m=+1288.481544222"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.397984 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.418311 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.425950 4822 scope.go:117] "RemoveContainer" containerID="a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.436119 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.449794 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 10 06:45:41 crc kubenswrapper[4822]: E1010 06:45:41.450266 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-log"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450286 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-log"
Oct 10 06:45:41 crc kubenswrapper[4822]: E1010 06:45:41.450299 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerName="nova-scheduler-scheduler"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450307 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerName="nova-scheduler-scheduler"
Oct 10 06:45:41 crc kubenswrapper[4822]: E1010 06:45:41.450318 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-api"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450324 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-api"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450548 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-api"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450565 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" containerName="nova-api-log"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.450588 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" containerName="nova-scheduler-scheduler"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.451623 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.455340 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481249 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data\") pod \"536625e7-3444-4b3f-bd94-c9fb49e997fb\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481292 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2wzs\" (UniqueName: \"kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs\") pod \"536625e7-3444-4b3f-bd94-c9fb49e997fb\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481393 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle\") pod \"536625e7-3444-4b3f-bd94-c9fb49e997fb\" (UID: \"536625e7-3444-4b3f-bd94-c9fb49e997fb\") "
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481740 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481824 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rgmm\" (UniqueName: \"kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481859 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.481945 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.482078 4822 scope.go:117] "RemoveContainer" containerID="4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"
Oct 10 06:45:41 crc kubenswrapper[4822]: E1010 06:45:41.483072 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774\": container with ID starting with 4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774 not found: ID does not exist" containerID="4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.483106 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774"} err="failed to get container status \"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774\": rpc error: code = NotFound desc = could not find container \"4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774\": container with ID starting with 4fc78cad0a8fa4f53d663564a727b42123b83f415a5b8ba5a1259a180bc71774 not found: ID does not exist"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.483127 4822 scope.go:117] "RemoveContainer" containerID="a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"
Oct 10 06:45:41 crc kubenswrapper[4822]: E1010 06:45:41.486831 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c\": container with ID starting with a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c not found: ID does not exist" containerID="a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.486872 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c"} err="failed to get container status \"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c\": rpc error: code = NotFound desc = could not find container \"a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c\": container with ID starting with a15cb364d15e8f9a1c64ca6c911abf5a72a8d534249bc422b41626664de1826c not found: ID does not exist"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.491896 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.496225 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs" (OuterVolumeSpecName: "kube-api-access-d2wzs") pod "536625e7-3444-4b3f-bd94-c9fb49e997fb" (UID: "536625e7-3444-4b3f-bd94-c9fb49e997fb"). InnerVolumeSpecName "kube-api-access-d2wzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.507221 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "536625e7-3444-4b3f-bd94-c9fb49e997fb" (UID: "536625e7-3444-4b3f-bd94-c9fb49e997fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.515909 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data" (OuterVolumeSpecName: "config-data") pod "536625e7-3444-4b3f-bd94-c9fb49e997fb" (UID: "536625e7-3444-4b3f-bd94-c9fb49e997fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584228 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584336 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rgmm\" (UniqueName: \"kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584405 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584511 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584580 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584598 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2wzs\" (UniqueName: \"kubernetes.io/projected/536625e7-3444-4b3f-bd94-c9fb49e997fb-kube-api-access-d2wzs\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.584612 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536625e7-3444-4b3f-bd94-c9fb49e997fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.585258 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.589676 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.589720 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.607373 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rgmm\" (UniqueName: \"kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm\") pod \"nova-api-0\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.672039 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d06dcf6-eff3-4b8a-a9c2-12ac34696221" path="/var/lib/kubelet/pods/9d06dcf6-eff3-4b8a-a9c2-12ac34696221/volumes"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.701022 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.777556 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.804170 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 10 06:45:41 crc kubenswrapper[4822]: I1010 06:45:41.804222 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.268538 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.401597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7ab4fbc-298d-4250-bf01-a73155f35532","Type":"ContainerStarted","Data":"6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404"}
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.402154 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.408143 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerStarted","Data":"b6cee150d43952140c52e7d5376b6860f6e36510a654b4dd64c3207283addfac"}
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.410764 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.411025 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"536625e7-3444-4b3f-bd94-c9fb49e997fb","Type":"ContainerDied","Data":"c1f7116a9c1b49aaae73d02c2fc4fdf8a53cd6774e1272f80a57b4855120384b"}
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.411079 4822 scope.go:117] "RemoveContainer" containerID="b6ec36999229a94f1346112a31c2002bce15998c0f7cd9160cd4bae91ee0194a"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.440581 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.4405588209999998 podStartE2EDuration="2.440558821s" podCreationTimestamp="2025-10-10 06:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:42.419446889 +0000 UTC m=+1289.514605105" watchObservedRunningTime="2025-10-10 06:45:42.440558821 +0000 UTC m=+1289.535717017"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.446173 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.469733 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.490007 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.491236 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.494247 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.500679 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.500785 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.501076 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brgc\" (UniqueName: \"kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.504024 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.602867 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.603179 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.603209 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7brgc\" (UniqueName: \"kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.611314 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.611522 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.619946 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brgc\" (UniqueName: \"kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc\") pod \"nova-scheduler-0\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " pod="openstack/nova-scheduler-0"
Oct 10 06:45:42 crc kubenswrapper[4822]: I1010 06:45:42.813596 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.317278 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.421636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9625956-b34e-47ff-9685-db1f2aef4898","Type":"ContainerStarted","Data":"1ec46fc2f39c0c0f08b9103d3bc874b67b189053f8880166cbee520fdb2a85e8"} Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.425970 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerStarted","Data":"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122"} Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.426016 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerStarted","Data":"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc"} Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.450340 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.450318218 podStartE2EDuration="2.450318218s" podCreationTimestamp="2025-10-10 06:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:43.44762079 +0000 UTC m=+1290.542778996" watchObservedRunningTime="2025-10-10 06:45:43.450318218 +0000 UTC m=+1290.545476434" Oct 10 06:45:43 crc kubenswrapper[4822]: I1010 06:45:43.663462 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536625e7-3444-4b3f-bd94-c9fb49e997fb" path="/var/lib/kubelet/pods/536625e7-3444-4b3f-bd94-c9fb49e997fb/volumes" Oct 10 06:45:44 crc kubenswrapper[4822]: I1010 06:45:44.446530 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9625956-b34e-47ff-9685-db1f2aef4898","Type":"ContainerStarted","Data":"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495"} Oct 10 06:45:44 crc kubenswrapper[4822]: I1010 06:45:44.474695 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.474674059 podStartE2EDuration="2.474674059s" podCreationTimestamp="2025-10-10 06:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:45:44.465682469 +0000 UTC m=+1291.560840695" watchObservedRunningTime="2025-10-10 06:45:44.474674059 +0000 UTC m=+1291.569832265" Oct 10 06:45:46 crc kubenswrapper[4822]: I1010 06:45:46.805405 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 06:45:46 crc kubenswrapper[4822]: I1010 06:45:46.806315 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 06:45:47 crc kubenswrapper[4822]: I1010 06:45:47.815077 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 06:45:47 crc kubenswrapper[4822]: I1010 06:45:47.819015 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 06:45:47 crc kubenswrapper[4822]: I1010 06:45:47.819105 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Oct 10 06:45:50 crc kubenswrapper[4822]: I1010 06:45:50.770017 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 06:45:51 crc kubenswrapper[4822]: I1010 06:45:51.778663 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 06:45:51 crc kubenswrapper[4822]: I1010 06:45:51.778751 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 06:45:52 crc kubenswrapper[4822]: I1010 06:45:52.814884 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 06:45:52 crc kubenswrapper[4822]: I1010 06:45:52.858042 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 06:45:52 crc kubenswrapper[4822]: I1010 06:45:52.860985 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 06:45:52 crc kubenswrapper[4822]: I1010 06:45:52.860974 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 06:45:53 crc kubenswrapper[4822]: I1010 06:45:53.566230 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 06:45:56 crc kubenswrapper[4822]: I1010 06:45:56.810396 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 06:45:56 crc kubenswrapper[4822]: I1010 
06:45:56.810979 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 06:45:56 crc kubenswrapper[4822]: I1010 06:45:56.816311 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 06:45:56 crc kubenswrapper[4822]: I1010 06:45:56.817168 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.562491 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.601872 4822 generic.go:334] "Generic (PLEG): container finished" podID="fddbc361-4d41-4cab-9baf-aa1dfad98431" containerID="272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3" exitCode=137 Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.601923 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.601929 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fddbc361-4d41-4cab-9baf-aa1dfad98431","Type":"ContainerDied","Data":"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3"} Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.602055 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fddbc361-4d41-4cab-9baf-aa1dfad98431","Type":"ContainerDied","Data":"21f6228be49ef6a7badbd525ea1579ccd19b5d371d032867d61bf4dc9930696b"} Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.602076 4822 scope.go:117] "RemoveContainer" containerID="272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.629991 4822 scope.go:117] "RemoveContainer" containerID="272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3" Oct 10 06:45:59 crc kubenswrapper[4822]: E1010 06:45:59.630561 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3\": container with ID starting with 272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3 not found: ID does not exist" containerID="272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.630610 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3"} err="failed to get container status \"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3\": rpc error: code = NotFound desc = could not find container \"272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3\": container with ID starting with 
272b0b4d3757b0d392de046a8d8e8343fe5103c100e3b794784a053878f075a3 not found: ID does not exist" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.730619 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgh9\" (UniqueName: \"kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9\") pod \"fddbc361-4d41-4cab-9baf-aa1dfad98431\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.730693 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle\") pod \"fddbc361-4d41-4cab-9baf-aa1dfad98431\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.730756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data\") pod \"fddbc361-4d41-4cab-9baf-aa1dfad98431\" (UID: \"fddbc361-4d41-4cab-9baf-aa1dfad98431\") " Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.739056 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9" (OuterVolumeSpecName: "kube-api-access-trgh9") pod "fddbc361-4d41-4cab-9baf-aa1dfad98431" (UID: "fddbc361-4d41-4cab-9baf-aa1dfad98431"). InnerVolumeSpecName "kube-api-access-trgh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.768105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fddbc361-4d41-4cab-9baf-aa1dfad98431" (UID: "fddbc361-4d41-4cab-9baf-aa1dfad98431"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.773698 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data" (OuterVolumeSpecName: "config-data") pod "fddbc361-4d41-4cab-9baf-aa1dfad98431" (UID: "fddbc361-4d41-4cab-9baf-aa1dfad98431"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.833410 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.833448 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fddbc361-4d41-4cab-9baf-aa1dfad98431-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.833464 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trgh9\" (UniqueName: \"kubernetes.io/projected/fddbc361-4d41-4cab-9baf-aa1dfad98431-kube-api-access-trgh9\") on node \"crc\" DevicePath \"\"" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.943226 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.965174 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.973163 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:59 crc kubenswrapper[4822]: E1010 06:45:59.973964 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddbc361-4d41-4cab-9baf-aa1dfad98431" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.974010 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fddbc361-4d41-4cab-9baf-aa1dfad98431" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.990683 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fddbc361-4d41-4cab-9baf-aa1dfad98431" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.991494 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.991600 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.994348 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.994491 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 06:45:59 crc kubenswrapper[4822]: I1010 06:45:59.994565 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.138131 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.138338 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.138447 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.138542 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.138609 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glb7l\" (UniqueName: \"kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.240398 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.240506 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.240569 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.240622 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glb7l\" (UniqueName: \"kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.240664 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.245495 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.246285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.247038 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.248203 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.259864 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glb7l\" (UniqueName: \"kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.309702 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:00 crc kubenswrapper[4822]: I1010 06:46:00.727307 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.337050 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.337372 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.622594 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"419c8ee7-56fd-43cc-86de-7f647c708502","Type":"ContainerStarted","Data":"54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146"} Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.622893 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"419c8ee7-56fd-43cc-86de-7f647c708502","Type":"ContainerStarted","Data":"236d517d91ce9915087f2a579c916e82cb4c9fe893ae6001daac69e5108647cf"} Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.650757 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.650736195 podStartE2EDuration="2.650736195s" podCreationTimestamp="2025-10-10 06:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 06:46:01.646330708 +0000 UTC m=+1308.741488914" watchObservedRunningTime="2025-10-10 06:46:01.650736195 +0000 UTC m=+1308.745894391" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.668069 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fddbc361-4d41-4cab-9baf-aa1dfad98431" path="/var/lib/kubelet/pods/fddbc361-4d41-4cab-9baf-aa1dfad98431/volumes" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.781123 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.781765 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.784193 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 06:46:01 crc kubenswrapper[4822]: I1010 06:46:01.787521 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.634139 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.637862 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.840595 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"] Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.845513 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.858889 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"] Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992002 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992063 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992107 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992156 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjfj\" (UniqueName: \"kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992188 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:02 crc kubenswrapper[4822]: I1010 06:46:02.992205 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094540 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094840 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094886 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094938 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjfj\" (UniqueName: 
\"kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094973 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.094996 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.095975 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.096221 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.096569 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config\") pod 
\"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.097076 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.097584 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.126156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjfj\" (UniqueName: \"kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj\") pod \"dnsmasq-dns-89c5cd4d5-9h2hz\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.180917 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:03 crc kubenswrapper[4822]: I1010 06:46:03.775779 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"] Oct 10 06:46:04 crc kubenswrapper[4822]: I1010 06:46:04.650201 4822 generic.go:334] "Generic (PLEG): container finished" podID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerID="1efb57ad3e93a9791822615f0a82caa8998c856b0e2fc40cd1c40d4fa6e98d56" exitCode=0 Oct 10 06:46:04 crc kubenswrapper[4822]: I1010 06:46:04.650251 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" event={"ID":"999b3a9f-9559-4baa-9f36-4f91631fb1fc","Type":"ContainerDied","Data":"1efb57ad3e93a9791822615f0a82caa8998c856b0e2fc40cd1c40d4fa6e98d56"} Oct 10 06:46:04 crc kubenswrapper[4822]: I1010 06:46:04.650560 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" event={"ID":"999b3a9f-9559-4baa-9f36-4f91631fb1fc","Type":"ContainerStarted","Data":"da6b134ceab1bb885c711edb9da894b922f48bb12a8888eb6880b76e528a90c9"} Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.039514 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.040025 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-central-agent" containerID="cri-o://f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929" gracePeriod=30 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.040134 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" containerID="cri-o://b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f" gracePeriod=30 Oct 10 06:46:05 crc 
kubenswrapper[4822]: I1010 06:46:05.040171 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="sg-core" containerID="cri-o://775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49" gracePeriod=30 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.040323 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-notification-agent" containerID="cri-o://73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706" gracePeriod=30 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.056234 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.310861 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.643622 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.663095 4822 generic.go:334] "Generic (PLEG): container finished" podID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerID="b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f" exitCode=0 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.663133 4822 generic.go:334] "Generic (PLEG): container finished" podID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerID="775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49" exitCode=2 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.663144 4822 generic.go:334] "Generic (PLEG): container finished" podID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" 
containerID="f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929" exitCode=0 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.663326 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-log" containerID="cri-o://9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc" gracePeriod=30 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.663444 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-api" containerID="cri-o://c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122" gracePeriod=30 Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.672840 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" event={"ID":"999b3a9f-9559-4baa-9f36-4f91631fb1fc","Type":"ContainerStarted","Data":"203a642bf3c3f9b4af40226e7974ac31a360eac399b610e485b7214df187d558"} Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.672922 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.672938 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerDied","Data":"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f"} Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.672959 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerDied","Data":"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49"} Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.672972 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerDied","Data":"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929"} Oct 10 06:46:05 crc kubenswrapper[4822]: I1010 06:46:05.685704 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" podStartSLOduration=3.685682859 podStartE2EDuration="3.685682859s" podCreationTimestamp="2025-10-10 06:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:05.677918185 +0000 UTC m=+1312.773076391" watchObservedRunningTime="2025-10-10 06:46:05.685682859 +0000 UTC m=+1312.780841055" Oct 10 06:46:06 crc kubenswrapper[4822]: I1010 06:46:06.673126 4822 generic.go:334] "Generic (PLEG): container finished" podID="0c550619-32d2-4dde-9725-4fe2914dc709" containerID="9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc" exitCode=143 Oct 10 06:46:06 crc kubenswrapper[4822]: I1010 06:46:06.673198 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerDied","Data":"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc"} Oct 10 06:46:06 crc kubenswrapper[4822]: I1010 06:46:06.749535 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": dial tcp 10.217.0.192:3000: connect: connection refused" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.347914 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.417546 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data\") pod \"0c550619-32d2-4dde-9725-4fe2914dc709\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.417641 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs\") pod \"0c550619-32d2-4dde-9725-4fe2914dc709\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.417693 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle\") pod \"0c550619-32d2-4dde-9725-4fe2914dc709\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.417726 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rgmm\" (UniqueName: \"kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm\") pod \"0c550619-32d2-4dde-9725-4fe2914dc709\" (UID: \"0c550619-32d2-4dde-9725-4fe2914dc709\") " Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.418163 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs" (OuterVolumeSpecName: "logs") pod "0c550619-32d2-4dde-9725-4fe2914dc709" (UID: "0c550619-32d2-4dde-9725-4fe2914dc709"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.423383 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm" (OuterVolumeSpecName: "kube-api-access-7rgmm") pod "0c550619-32d2-4dde-9725-4fe2914dc709" (UID: "0c550619-32d2-4dde-9725-4fe2914dc709"). InnerVolumeSpecName "kube-api-access-7rgmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.456499 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c550619-32d2-4dde-9725-4fe2914dc709" (UID: "0c550619-32d2-4dde-9725-4fe2914dc709"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.468261 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data" (OuterVolumeSpecName: "config-data") pod "0c550619-32d2-4dde-9725-4fe2914dc709" (UID: "0c550619-32d2-4dde-9725-4fe2914dc709"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.520993 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rgmm\" (UniqueName: \"kubernetes.io/projected/0c550619-32d2-4dde-9725-4fe2914dc709-kube-api-access-7rgmm\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.521041 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.521054 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c550619-32d2-4dde-9725-4fe2914dc709-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.521066 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c550619-32d2-4dde-9725-4fe2914dc709-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.702651 4822 generic.go:334] "Generic (PLEG): container finished" podID="0c550619-32d2-4dde-9725-4fe2914dc709" containerID="c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122" exitCode=0 Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.702712 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.702710 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerDied","Data":"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122"} Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.702784 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c550619-32d2-4dde-9725-4fe2914dc709","Type":"ContainerDied","Data":"b6cee150d43952140c52e7d5376b6860f6e36510a654b4dd64c3207283addfac"} Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.702822 4822 scope.go:117] "RemoveContainer" containerID="c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.725274 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.727136 4822 scope.go:117] "RemoveContainer" containerID="9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.736523 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.746992 4822 scope.go:117] "RemoveContainer" containerID="c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122" Oct 10 06:46:09 crc kubenswrapper[4822]: E1010 06:46:09.747436 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122\": container with ID starting with c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122 not found: ID does not exist" containerID="c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.747483 
4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122"} err="failed to get container status \"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122\": rpc error: code = NotFound desc = could not find container \"c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122\": container with ID starting with c1c212557000a89b0455d7e5c16a1f4e450d164abc26e4384ae12ed84c5cd122 not found: ID does not exist" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.747524 4822 scope.go:117] "RemoveContainer" containerID="9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc" Oct 10 06:46:09 crc kubenswrapper[4822]: E1010 06:46:09.747828 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc\": container with ID starting with 9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc not found: ID does not exist" containerID="9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.747856 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc"} err="failed to get container status \"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc\": rpc error: code = NotFound desc = could not find container \"9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc\": container with ID starting with 9689945172fcc7316f8d01fe1e44681914102ec48e00578574e578161a14becc not found: ID does not exist" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.755215 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:09 crc kubenswrapper[4822]: E1010 06:46:09.755778 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-log" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.755795 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-log" Oct 10 06:46:09 crc kubenswrapper[4822]: E1010 06:46:09.755918 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-api" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.755927 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-api" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.756177 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-api" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.756190 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" containerName="nova-api-log" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.757426 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.765637 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.766001 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.766308 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.766778 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827071 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g2vq\" (UniqueName: \"kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827260 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827297 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827328 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.827441 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.929790 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930031 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930137 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 
10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930278 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930394 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930536 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g2vq\" (UniqueName: \"kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.930830 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.934178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.934336 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.934385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.948342 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:09 crc kubenswrapper[4822]: I1010 06:46:09.949302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g2vq\" (UniqueName: \"kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq\") pod \"nova-api-0\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " pod="openstack/nova-api-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.086508 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.313209 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.400952 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.457430 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.457813 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4qqf\" (UniqueName: \"kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.457854 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.457922 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.457954 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.458000 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.458033 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.458091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts\") pod \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\" (UID: \"93b47efc-6ac4-4e55-a01c-a865fd89abe8\") " Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.474101 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts" (OuterVolumeSpecName: "scripts") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.474105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.478277 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.487044 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf" (OuterVolumeSpecName: "kube-api-access-h4qqf") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "kube-api-access-h4qqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.488259 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.534076 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.545727 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560636 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560664 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4qqf\" (UniqueName: \"kubernetes.io/projected/93b47efc-6ac4-4e55-a01c-a865fd89abe8-kube-api-access-h4qqf\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560675 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b47efc-6ac4-4e55-a01c-a865fd89abe8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560684 4822 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560693 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.560700 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.574786 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: 
"93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.601201 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data" (OuterVolumeSpecName: "config-data") pod "93b47efc-6ac4-4e55-a01c-a865fd89abe8" (UID: "93b47efc-6ac4-4e55-a01c-a865fd89abe8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.662613 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.662658 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b47efc-6ac4-4e55-a01c-a865fd89abe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.694696 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.722058 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerStarted","Data":"ec6b23b366f01d690c5c2416aa572ba7551515d7b0f10d0d47b08903ac577d77"} Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.728391 4822 generic.go:334] "Generic (PLEG): container finished" podID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerID="73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706" exitCode=0 Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.728464 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.728477 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerDied","Data":"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706"} Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.728519 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93b47efc-6ac4-4e55-a01c-a865fd89abe8","Type":"ContainerDied","Data":"e596c8f429f98c6fcb09b8eb4c72f72ab798200f99a951d5fbed9b47108db75d"} Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.728545 4822 scope.go:117] "RemoveContainer" containerID="b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.748101 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.751972 4822 scope.go:117] "RemoveContainer" containerID="775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.772055 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.786920 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.801050 4822 scope.go:117] "RemoveContainer" containerID="73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.802872 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.803388 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" 
containerName="ceilometer-notification-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803413 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-notification-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.803433 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="sg-core" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803443 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="sg-core" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.803482 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-central-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803491 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-central-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.803507 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803514 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803775 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-central-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803817 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="sg-core" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803830 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="ceilometer-notification-agent" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.803843 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" containerName="proxy-httpd" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.805911 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.811821 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.812582 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.812739 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.846142 4822 scope.go:117] "RemoveContainer" containerID="f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.863306 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.867567 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.867729 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " 
pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.867823 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.867932 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.868003 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.868123 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.868199 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d75b\" (UniqueName: \"kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.868290 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.891557 4822 scope.go:117] "RemoveContainer" containerID="b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.893121 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f\": container with ID starting with b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f not found: ID does not exist" containerID="b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.893171 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f"} err="failed to get container status \"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f\": rpc error: code = NotFound desc = could not find container \"b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f\": container with ID starting with b0ab9114a3aef8df254a335adf95aa134547d1371df314453cff9a406fe2039f not found: ID does not exist" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.893203 4822 scope.go:117] "RemoveContainer" containerID="775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.893568 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49\": container with ID starting with 
775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49 not found: ID does not exist" containerID="775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.893624 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49"} err="failed to get container status \"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49\": rpc error: code = NotFound desc = could not find container \"775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49\": container with ID starting with 775f9fabab7d5979c72991425075022c472e1d8ea72b899a9bf67e2dc592be49 not found: ID does not exist" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.893660 4822 scope.go:117] "RemoveContainer" containerID="73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.898134 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706\": container with ID starting with 73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706 not found: ID does not exist" containerID="73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.898186 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706"} err="failed to get container status \"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706\": rpc error: code = NotFound desc = could not find container \"73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706\": container with ID starting with 73ba58d6902cc5a8114c1c307024541106cc91898a5f514a0a4153c38f41f706 not found: ID does not 
exist" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.898219 4822 scope.go:117] "RemoveContainer" containerID="f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929" Oct 10 06:46:10 crc kubenswrapper[4822]: E1010 06:46:10.904371 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929\": container with ID starting with f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929 not found: ID does not exist" containerID="f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.904412 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929"} err="failed to get container status \"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929\": rpc error: code = NotFound desc = could not find container \"f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929\": container with ID starting with f0581c73dfd53daf737a7d11ac4d986c1d518cf37c775188d3b94b0aacca9929 not found: ID does not exist" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.935844 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ktcrj"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.937012 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.944036 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ktcrj"] Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.949188 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.949216 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d75b\" (UniqueName: \"kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970175 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970257 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970312 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970345 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970362 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970402 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970422 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970455 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970481 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzmm9\" (UniqueName: \"kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.970508 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.972675 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.972764 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.977606 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.977936 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.978112 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.988127 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:10 crc kubenswrapper[4822]: I1010 06:46:10.995589 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.009370 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d75b\" (UniqueName: \"kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b\") pod \"ceilometer-0\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " pod="openstack/ceilometer-0" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.071775 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.071925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.071957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzmm9\" (UniqueName: \"kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.071992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.078139 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.079436 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.079487 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.089492 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzmm9\" (UniqueName: \"kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9\") pod \"nova-cell1-cell-mapping-ktcrj\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.150998 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.247094 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:11 crc kubenswrapper[4822]: W1010 06:46:11.647081 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151eccad_6f76_476d_a2f4_53123f29bdb7.slice/crio-469406a00ece231f5e75c8f55955bf85d697a9eea052ec38a3ddac6cf51ad9a3 WatchSource:0}: Error finding container 469406a00ece231f5e75c8f55955bf85d697a9eea052ec38a3ddac6cf51ad9a3: Status 404 returned error can't find the container with id 469406a00ece231f5e75c8f55955bf85d697a9eea052ec38a3ddac6cf51ad9a3 Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.660902 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c550619-32d2-4dde-9725-4fe2914dc709" path="/var/lib/kubelet/pods/0c550619-32d2-4dde-9725-4fe2914dc709/volumes" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.661984 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b47efc-6ac4-4e55-a01c-a865fd89abe8" path="/var/lib/kubelet/pods/93b47efc-6ac4-4e55-a01c-a865fd89abe8/volumes" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.662864 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.739774 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerStarted","Data":"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab"} Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.739839 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerStarted","Data":"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758"} Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.749781 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerStarted","Data":"469406a00ece231f5e75c8f55955bf85d697a9eea052ec38a3ddac6cf51ad9a3"} Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.779376 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.779352858 podStartE2EDuration="2.779352858s" podCreationTimestamp="2025-10-10 06:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:11.767388832 +0000 UTC m=+1318.862547038" watchObservedRunningTime="2025-10-10 06:46:11.779352858 +0000 UTC m=+1318.874511074" Oct 10 06:46:11 crc kubenswrapper[4822]: I1010 06:46:11.806862 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ktcrj"] Oct 10 06:46:11 crc kubenswrapper[4822]: W1010 06:46:11.808931 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9987828_34f4_4a64_ac85_8c9ee937fe1c.slice/crio-458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10 WatchSource:0}: Error finding container 458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10: Status 404 returned error can't find the container with id 458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10 Oct 10 06:46:12 crc kubenswrapper[4822]: I1010 06:46:12.761459 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerStarted","Data":"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"} Oct 10 06:46:12 crc kubenswrapper[4822]: I1010 06:46:12.764417 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ktcrj" 
event={"ID":"b9987828-34f4-4a64-ac85-8c9ee937fe1c","Type":"ContainerStarted","Data":"4e20c2d87e457b4faea64c3dea709aa2f899b6ef000fddafdc23dfca5479df95"} Oct 10 06:46:12 crc kubenswrapper[4822]: I1010 06:46:12.764471 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ktcrj" event={"ID":"b9987828-34f4-4a64-ac85-8c9ee937fe1c","Type":"ContainerStarted","Data":"458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10"} Oct 10 06:46:12 crc kubenswrapper[4822]: I1010 06:46:12.782614 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ktcrj" podStartSLOduration=2.782595975 podStartE2EDuration="2.782595975s" podCreationTimestamp="2025-10-10 06:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:12.779344841 +0000 UTC m=+1319.874503057" watchObservedRunningTime="2025-10-10 06:46:12.782595975 +0000 UTC m=+1319.877754171" Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.182037 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.276217 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.276668 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="dnsmasq-dns" containerID="cri-o://428bf94e5a8032585c4fea0fa4e5da331ef5ad8700f506b032bf5e4e5b727f3d" gracePeriod=10 Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.784855 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerStarted","Data":"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"} Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.786977 4822 generic.go:334] "Generic (PLEG): container finished" podID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerID="428bf94e5a8032585c4fea0fa4e5da331ef5ad8700f506b032bf5e4e5b727f3d" exitCode=0 Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.787058 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" event={"ID":"04f0c0de-cff5-4a04-9b80-204c4791f430","Type":"ContainerDied","Data":"428bf94e5a8032585c4fea0fa4e5da331ef5ad8700f506b032bf5e4e5b727f3d"} Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.836592 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.944998 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.945081 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.945242 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc 
kubenswrapper[4822]: I1010 06:46:13.945313 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.945377 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.945863 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjgx5\" (UniqueName: \"kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5\") pod \"04f0c0de-cff5-4a04-9b80-204c4791f430\" (UID: \"04f0c0de-cff5-4a04-9b80-204c4791f430\") " Oct 10 06:46:13 crc kubenswrapper[4822]: I1010 06:46:13.950190 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5" (OuterVolumeSpecName: "kube-api-access-pjgx5") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "kube-api-access-pjgx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.004686 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config" (OuterVolumeSpecName: "config") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.008355 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.011415 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.028563 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.039122 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04f0c0de-cff5-4a04-9b80-204c4791f430" (UID: "04f0c0de-cff5-4a04-9b80-204c4791f430"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050521 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050550 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050566 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050577 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjgx5\" (UniqueName: \"kubernetes.io/projected/04f0c0de-cff5-4a04-9b80-204c4791f430-kube-api-access-pjgx5\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050587 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.050596 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f0c0de-cff5-4a04-9b80-204c4791f430-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.798866 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerStarted","Data":"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"} Oct 10 06:46:14 crc kubenswrapper[4822]: 
I1010 06:46:14.800793 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" event={"ID":"04f0c0de-cff5-4a04-9b80-204c4791f430","Type":"ContainerDied","Data":"a9533b6facc57803587c8a5df3107fb924a3f02438dee24f313c85d755c21758"} Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.800860 4822 scope.go:117] "RemoveContainer" containerID="428bf94e5a8032585c4fea0fa4e5da331ef5ad8700f506b032bf5e4e5b727f3d" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.800889 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9cmt4" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.825134 4822 scope.go:117] "RemoveContainer" containerID="72fd1f10fe85a512180703a8a77bc5c119e8cb266dce3be3540c479f736477cb" Oct 10 06:46:14 crc kubenswrapper[4822]: I1010 06:46:14.980884 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:46:15 crc kubenswrapper[4822]: I1010 06:46:15.015621 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9cmt4"] Oct 10 06:46:15 crc kubenswrapper[4822]: I1010 06:46:15.662466 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" path="/var/lib/kubelet/pods/04f0c0de-cff5-4a04-9b80-204c4791f430/volumes" Oct 10 06:46:15 crc kubenswrapper[4822]: I1010 06:46:15.811396 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerStarted","Data":"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"} Oct 10 06:46:15 crc kubenswrapper[4822]: I1010 06:46:15.811879 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 06:46:16 crc kubenswrapper[4822]: I1010 06:46:16.823921 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="b9987828-34f4-4a64-ac85-8c9ee937fe1c" containerID="4e20c2d87e457b4faea64c3dea709aa2f899b6ef000fddafdc23dfca5479df95" exitCode=0 Oct 10 06:46:16 crc kubenswrapper[4822]: I1010 06:46:16.825062 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ktcrj" event={"ID":"b9987828-34f4-4a64-ac85-8c9ee937fe1c","Type":"ContainerDied","Data":"4e20c2d87e457b4faea64c3dea709aa2f899b6ef000fddafdc23dfca5479df95"} Oct 10 06:46:16 crc kubenswrapper[4822]: I1010 06:46:16.847023 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.520711409 podStartE2EDuration="6.847002891s" podCreationTimestamp="2025-10-10 06:46:10 +0000 UTC" firstStartedPulling="2025-10-10 06:46:11.651667548 +0000 UTC m=+1318.746825744" lastFinishedPulling="2025-10-10 06:46:14.97795903 +0000 UTC m=+1322.073117226" observedRunningTime="2025-10-10 06:46:15.835227767 +0000 UTC m=+1322.930385983" watchObservedRunningTime="2025-10-10 06:46:16.847002891 +0000 UTC m=+1323.942161087" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.249483 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.362318 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle\") pod \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.362474 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzmm9\" (UniqueName: \"kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9\") pod \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.362608 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts\") pod \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.362862 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data\") pod \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\" (UID: \"b9987828-34f4-4a64-ac85-8c9ee937fe1c\") " Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.376012 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9" (OuterVolumeSpecName: "kube-api-access-pzmm9") pod "b9987828-34f4-4a64-ac85-8c9ee937fe1c" (UID: "b9987828-34f4-4a64-ac85-8c9ee937fe1c"). InnerVolumeSpecName "kube-api-access-pzmm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.376287 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts" (OuterVolumeSpecName: "scripts") pod "b9987828-34f4-4a64-ac85-8c9ee937fe1c" (UID: "b9987828-34f4-4a64-ac85-8c9ee937fe1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.392260 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data" (OuterVolumeSpecName: "config-data") pod "b9987828-34f4-4a64-ac85-8c9ee937fe1c" (UID: "b9987828-34f4-4a64-ac85-8c9ee937fe1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.401860 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9987828-34f4-4a64-ac85-8c9ee937fe1c" (UID: "b9987828-34f4-4a64-ac85-8c9ee937fe1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.464747 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzmm9\" (UniqueName: \"kubernetes.io/projected/b9987828-34f4-4a64-ac85-8c9ee937fe1c-kube-api-access-pzmm9\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.464785 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.464795 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.464825 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9987828-34f4-4a64-ac85-8c9ee937fe1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.846747 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ktcrj" event={"ID":"b9987828-34f4-4a64-ac85-8c9ee937fe1c","Type":"ContainerDied","Data":"458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10"} Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.846864 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458d7cbc77be2899316bb537e593090bcfc7b004a4ffb0fcc705b7223d515f10" Oct 10 06:46:18 crc kubenswrapper[4822]: I1010 06:46:18.846878 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ktcrj" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.047367 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.047665 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-log" containerID="cri-o://78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" gracePeriod=30 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.047969 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-api" containerID="cri-o://2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" gracePeriod=30 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.060943 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.061177 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f9625956-b34e-47ff-9685-db1f2aef4898" containerName="nova-scheduler-scheduler" containerID="cri-o://690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495" gracePeriod=30 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.081217 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.081459 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" containerID="cri-o://a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743" gracePeriod=30 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.081599 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" containerID="cri-o://225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4" gracePeriod=30 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.652682 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687259 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687336 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g2vq\" (UniqueName: \"kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687436 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687541 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687575 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.687597 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data\") pod \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\" (UID: \"e155ba38-6cb9-4dda-b10d-f081fc7d284f\") " Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.689474 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs" (OuterVolumeSpecName: "logs") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.700176 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq" (OuterVolumeSpecName: "kube-api-access-5g2vq") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "kube-api-access-5g2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.736166 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.745016 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data" (OuterVolumeSpecName: "config-data") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.748979 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.761519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e155ba38-6cb9-4dda-b10d-f081fc7d284f" (UID: "e155ba38-6cb9-4dda-b10d-f081fc7d284f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789351 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789663 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789759 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789835 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e155ba38-6cb9-4dda-b10d-f081fc7d284f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789894 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g2vq\" (UniqueName: \"kubernetes.io/projected/e155ba38-6cb9-4dda-b10d-f081fc7d284f-kube-api-access-5g2vq\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.789962 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e155ba38-6cb9-4dda-b10d-f081fc7d284f-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.857386 4822 generic.go:334] "Generic (PLEG): container finished" podID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerID="a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743" exitCode=143 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.857459 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerDied","Data":"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743"} Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859191 4822 generic.go:334] "Generic (PLEG): container finished" podID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerID="2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" exitCode=0 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859316 4822 generic.go:334] "Generic (PLEG): container finished" podID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerID="78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" exitCode=143 Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859351 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerDied","Data":"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab"} Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859553 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerDied","Data":"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758"} Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859679 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e155ba38-6cb9-4dda-b10d-f081fc7d284f","Type":"ContainerDied","Data":"ec6b23b366f01d690c5c2416aa572ba7551515d7b0f10d0d47b08903ac577d77"} Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859436 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.859845 4822 scope.go:117] "RemoveContainer" containerID="2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.891630 4822 scope.go:117] "RemoveContainer" containerID="78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.895348 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.905480 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.924565 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.925325 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="init" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.925423 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="init" Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.925592 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-api" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.925849 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-api" Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.925980 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="dnsmasq-dns" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.926116 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="dnsmasq-dns" 
Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.926228 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-log" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.926326 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-log" Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.926434 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9987828-34f4-4a64-ac85-8c9ee937fe1c" containerName="nova-manage" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.926536 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9987828-34f4-4a64-ac85-8c9ee937fe1c" containerName="nova-manage" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.925728 4822 scope.go:117] "RemoveContainer" containerID="2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.926980 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9987828-34f4-4a64-ac85-8c9ee937fe1c" containerName="nova-manage" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927089 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f0c0de-cff5-4a04-9b80-204c4791f430" containerName="dnsmasq-dns" Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.927154 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab\": container with ID starting with 2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab not found: ID does not exist" containerID="2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927208 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab"} err="failed to get container status \"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab\": rpc error: code = NotFound desc = could not find container \"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab\": container with ID starting with 2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab not found: ID does not exist" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927237 4822 scope.go:117] "RemoveContainer" containerID="78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927167 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-log" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927373 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" containerName="nova-api-api" Oct 10 06:46:19 crc kubenswrapper[4822]: E1010 06:46:19.927640 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758\": container with ID starting with 78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758 not found: ID does not exist" containerID="78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927759 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758"} err="failed to get container status \"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758\": rpc error: code = NotFound desc = could not find container \"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758\": container with ID starting 
with 78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758 not found: ID does not exist" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.927888 4822 scope.go:117] "RemoveContainer" containerID="2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.928367 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab"} err="failed to get container status \"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab\": rpc error: code = NotFound desc = could not find container \"2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab\": container with ID starting with 2adaf1fff929bedd1242eca233087a9a4c77f31d319149af7655e6a674632aab not found: ID does not exist" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.928410 4822 scope.go:117] "RemoveContainer" containerID="78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.928515 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.929177 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758"} err="failed to get container status \"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758\": rpc error: code = NotFound desc = could not find container \"78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758\": container with ID starting with 78cffc08a19bddb285bc90be84f98adc6f015c6af2b9f3e77ae050a1d37ec758 not found: ID does not exist" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.930501 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.930725 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.933458 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.934383 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.992752 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kjl\" (UniqueName: \"kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.993338 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data\") pod \"nova-api-0\" (UID: 
\"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.993470 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.993629 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.993793 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:19 crc kubenswrapper[4822]: I1010 06:46:19.994005 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.096577 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.096688 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.096779 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.096902 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.096992 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kjl\" (UniqueName: \"kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.097231 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.097906 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " 
pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.101435 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.101755 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.102489 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.103095 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.117177 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kjl\" (UniqueName: \"kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl\") pod \"nova-api-0\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.267015 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.725802 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:46:20 crc kubenswrapper[4822]: I1010 06:46:20.869639 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerStarted","Data":"e616bfd3426a5248dc7b7c41864dc2d03e683a80dd5bd1aa85eafddc7d8adda0"} Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.665292 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e155ba38-6cb9-4dda-b10d-f081fc7d284f" path="/var/lib/kubelet/pods/e155ba38-6cb9-4dda-b10d-f081fc7d284f/volumes" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.811755 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.882197 4822 generic.go:334] "Generic (PLEG): container finished" podID="f9625956-b34e-47ff-9685-db1f2aef4898" containerID="690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495" exitCode=0 Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.882405 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9625956-b34e-47ff-9685-db1f2aef4898","Type":"ContainerDied","Data":"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495"} Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.882515 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.882680 4822 scope.go:117] "RemoveContainer" containerID="690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.882665 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9625956-b34e-47ff-9685-db1f2aef4898","Type":"ContainerDied","Data":"1ec46fc2f39c0c0f08b9103d3bc874b67b189053f8880166cbee520fdb2a85e8"} Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.887034 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerStarted","Data":"c06e8d4a5f40e2de362c673f9a2482416e108f4360a178ff0fccd0dcaf4c36e9"} Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.887089 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerStarted","Data":"3eab14a45cceb25e06794fdc7e4a763b3c3af3f0e97c2df5eb1c6cd89012256a"} Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.915137 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.915117378 podStartE2EDuration="2.915117378s" podCreationTimestamp="2025-10-10 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:21.909724822 +0000 UTC m=+1329.004883058" watchObservedRunningTime="2025-10-10 06:46:21.915117378 +0000 UTC m=+1329.010275574" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.928577 4822 scope.go:117] "RemoveContainer" containerID="690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495" Oct 10 06:46:21 crc kubenswrapper[4822]: E1010 06:46:21.929123 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495\": container with ID starting with 690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495 not found: ID does not exist" containerID="690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.929163 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495"} err="failed to get container status \"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495\": rpc error: code = NotFound desc = could not find container \"690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495\": container with ID starting with 690a14513c574f97af6fcd5a414e16edd240091ac19ceb7e52e33788d3801495 not found: ID does not exist" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.936308 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7brgc\" (UniqueName: \"kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc\") pod \"f9625956-b34e-47ff-9685-db1f2aef4898\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.936474 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data\") pod \"f9625956-b34e-47ff-9685-db1f2aef4898\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.936503 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle\") pod \"f9625956-b34e-47ff-9685-db1f2aef4898\" (UID: \"f9625956-b34e-47ff-9685-db1f2aef4898\") " Oct 
10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.941512 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc" (OuterVolumeSpecName: "kube-api-access-7brgc") pod "f9625956-b34e-47ff-9685-db1f2aef4898" (UID: "f9625956-b34e-47ff-9685-db1f2aef4898"). InnerVolumeSpecName "kube-api-access-7brgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.968095 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data" (OuterVolumeSpecName: "config-data") pod "f9625956-b34e-47ff-9685-db1f2aef4898" (UID: "f9625956-b34e-47ff-9685-db1f2aef4898"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:21 crc kubenswrapper[4822]: I1010 06:46:21.968166 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9625956-b34e-47ff-9685-db1f2aef4898" (UID: "f9625956-b34e-47ff-9685-db1f2aef4898"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.038677 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.038715 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9625956-b34e-47ff-9685-db1f2aef4898-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.038725 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7brgc\" (UniqueName: \"kubernetes.io/projected/f9625956-b34e-47ff-9685-db1f2aef4898-kube-api-access-7brgc\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.213736 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.221282 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.232354 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:55662->10.217.0.193:8775: read: connection reset by peer" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.232484 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:55654->10.217.0.193:8775: read: connection reset by peer" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.244712 4822 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: E1010 06:46:22.245197 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9625956-b34e-47ff-9685-db1f2aef4898" containerName="nova-scheduler-scheduler" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.245215 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9625956-b34e-47ff-9685-db1f2aef4898" containerName="nova-scheduler-scheduler" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.245396 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9625956-b34e-47ff-9685-db1f2aef4898" containerName="nova-scheduler-scheduler" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.246032 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.249403 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.278673 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.343784 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.343926 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkbjs\" (UniqueName: \"kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc 
kubenswrapper[4822]: I1010 06:46:22.343997 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.445985 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.446095 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.446285 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkbjs\" (UniqueName: \"kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.450775 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.460258 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.463579 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkbjs\" (UniqueName: \"kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs\") pod \"nova-scheduler-0\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.622664 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.680959 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.752365 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs\") pod \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.752539 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs\") pod \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.752593 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle\") pod \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " Oct 10 06:46:22 crc 
kubenswrapper[4822]: I1010 06:46:22.752631 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92dfj\" (UniqueName: \"kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj\") pod \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.752678 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data\") pod \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\" (UID: \"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79\") " Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.753098 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs" (OuterVolumeSpecName: "logs") pod "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" (UID: "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.753983 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.758042 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj" (OuterVolumeSpecName: "kube-api-access-92dfj") pod "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" (UID: "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79"). InnerVolumeSpecName "kube-api-access-92dfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.784368 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data" (OuterVolumeSpecName: "config-data") pod "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" (UID: "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.789010 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" (UID: "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.815501 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" (UID: "6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.855506 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.855541 4822 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.855551 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.855561 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92dfj\" (UniqueName: \"kubernetes.io/projected/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79-kube-api-access-92dfj\") on node \"crc\" DevicePath \"\"" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.900753 4822 generic.go:334] "Generic (PLEG): container finished" podID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerID="225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4" exitCode=0 Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.900869 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerDied","Data":"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4"} Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.900907 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79","Type":"ContainerDied","Data":"c6f5be1d7e20090d364ec469b79bfb138fc170695c6c1a67f6223ab0f4fc7abc"} Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.900930 4822 scope.go:117] "RemoveContainer" containerID="225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.901059 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.941665 4822 scope.go:117] "RemoveContainer" containerID="a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.947093 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.960389 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.971830 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: E1010 06:46:22.972276 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.972294 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" Oct 10 06:46:22 crc kubenswrapper[4822]: E1010 06:46:22.972333 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.972339 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 
06:46:22.972511 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-metadata" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.972536 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" containerName="nova-metadata-log" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.973522 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.977267 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.977386 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.980161 4822 scope.go:117] "RemoveContainer" containerID="225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4" Oct 10 06:46:22 crc kubenswrapper[4822]: E1010 06:46:22.981388 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4\": container with ID starting with 225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4 not found: ID does not exist" containerID="225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.981419 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4"} err="failed to get container status \"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4\": rpc error: code = NotFound desc = could not find container \"225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4\": 
container with ID starting with 225a4ae48a42f8c615ac544a65bf5026457e9f73fda07ac1e3b038770cbbaca4 not found: ID does not exist" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.981441 4822 scope.go:117] "RemoveContainer" containerID="a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.981566 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:22 crc kubenswrapper[4822]: E1010 06:46:22.981671 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743\": container with ID starting with a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743 not found: ID does not exist" containerID="a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743" Oct 10 06:46:22 crc kubenswrapper[4822]: I1010 06:46:22.981715 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743"} err="failed to get container status \"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743\": rpc error: code = NotFound desc = could not find container \"a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743\": container with ID starting with a8d7b97b9e3f2ab218290741fe57c2c2b1316b0d5a2f4d2695a03c3675462743 not found: ID does not exist" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.059285 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.059326 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.059346 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.059524 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.059781 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqrrk\" (UniqueName: \"kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.110186 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:46:23 crc kubenswrapper[4822]: W1010 06:46:23.120193 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf4e23c_3df4_4d67_8a61_c97f860aa797.slice/crio-f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28 WatchSource:0}: Error finding container 
f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28: Status 404 returned error can't find the container with id f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28 Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161324 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161373 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161400 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161446 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161523 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqrrk\" (UniqueName: \"kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " 
pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.161989 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.165851 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.166330 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.166453 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.198506 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqrrk\" (UniqueName: \"kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk\") pod \"nova-metadata-0\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.290449 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.663629 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79" path="/var/lib/kubelet/pods/6f4e7a6a-d6b8-4c0d-ab35-4d0396a33c79/volumes" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.664724 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9625956-b34e-47ff-9685-db1f2aef4898" path="/var/lib/kubelet/pods/f9625956-b34e-47ff-9685-db1f2aef4898/volumes" Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.825486 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.927183 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ddf4e23c-3df4-4d67-8a61-c97f860aa797","Type":"ContainerStarted","Data":"81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0"} Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.927240 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ddf4e23c-3df4-4d67-8a61-c97f860aa797","Type":"ContainerStarted","Data":"f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28"} Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.930977 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerStarted","Data":"720f81325a5020549ed46f601002f3ce8017d12ce7ba9d5ef0d2fdf3b6e23cb6"} Oct 10 06:46:23 crc kubenswrapper[4822]: I1010 06:46:23.952458 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9524372030000001 podStartE2EDuration="1.952437203s" podCreationTimestamp="2025-10-10 06:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:23.946685007 +0000 UTC m=+1331.041843213" watchObservedRunningTime="2025-10-10 06:46:23.952437203 +0000 UTC m=+1331.047595399" Oct 10 06:46:24 crc kubenswrapper[4822]: I1010 06:46:24.943692 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerStarted","Data":"080914d126025b6e485b39c65b1fd29e1335110a9ec2a7216a930a1f2cc0164e"} Oct 10 06:46:24 crc kubenswrapper[4822]: I1010 06:46:24.943977 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerStarted","Data":"ad3e61286e546aa785f998bc78158faaf28a9227d533f3ce1eeace9ba41ab652"} Oct 10 06:46:24 crc kubenswrapper[4822]: I1010 06:46:24.964376 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.964349181 podStartE2EDuration="2.964349181s" podCreationTimestamp="2025-10-10 06:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:46:24.961552601 +0000 UTC m=+1332.056710857" watchObservedRunningTime="2025-10-10 06:46:24.964349181 +0000 UTC m=+1332.059507397" Oct 10 06:46:27 crc kubenswrapper[4822]: I1010 06:46:27.682474 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 06:46:28 crc kubenswrapper[4822]: I1010 06:46:28.291044 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 06:46:28 crc kubenswrapper[4822]: I1010 06:46:28.291118 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 06:46:30 crc kubenswrapper[4822]: I1010 06:46:30.268878 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Oct 10 06:46:30 crc kubenswrapper[4822]: I1010 06:46:30.269205 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 06:46:31 crc kubenswrapper[4822]: I1010 06:46:31.289105 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 06:46:31 crc kubenswrapper[4822]: I1010 06:46:31.289115 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 06:46:31 crc kubenswrapper[4822]: I1010 06:46:31.336722 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:46:31 crc kubenswrapper[4822]: I1010 06:46:31.336812 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:46:32 crc kubenswrapper[4822]: I1010 06:46:32.682509 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 06:46:32 crc kubenswrapper[4822]: I1010 06:46:32.716552 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Oct 10 06:46:33 crc kubenswrapper[4822]: I1010 06:46:33.044151 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 06:46:33 crc kubenswrapper[4822]: I1010 06:46:33.291373 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 06:46:33 crc kubenswrapper[4822]: I1010 06:46:33.291422 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 06:46:34 crc kubenswrapper[4822]: I1010 06:46:34.307128 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 06:46:34 crc kubenswrapper[4822]: I1010 06:46:34.307167 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 06:46:40 crc kubenswrapper[4822]: I1010 06:46:40.276901 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 06:46:40 crc kubenswrapper[4822]: I1010 06:46:40.277675 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 06:46:40 crc kubenswrapper[4822]: I1010 06:46:40.277835 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 06:46:40 crc kubenswrapper[4822]: I1010 06:46:40.290445 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 06:46:41 crc kubenswrapper[4822]: I1010 
06:46:41.120906 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 06:46:41 crc kubenswrapper[4822]: I1010 06:46:41.135918 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 06:46:41 crc kubenswrapper[4822]: I1010 06:46:41.167504 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 06:46:43 crc kubenswrapper[4822]: I1010 06:46:43.296905 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 06:46:43 crc kubenswrapper[4822]: I1010 06:46:43.297352 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 06:46:43 crc kubenswrapper[4822]: I1010 06:46:43.503551 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 06:46:44 crc kubenswrapper[4822]: I1010 06:46:44.155461 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.337150 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.337668 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.337708 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.338437 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.338493 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8" gracePeriod=600 Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.754868 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.755385 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c8cd3778-5a5a-483a-af22-5b8420ae896b" containerName="openstackclient" containerID="cri-o://37b4d43e434fbc8b371e9636d76d9af8856d96fcc8463cdd3a2a306557a6b929" gracePeriod=2 Oct 10 06:47:01 crc kubenswrapper[4822]: I1010 06:47:01.792297 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.066657 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"] Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.067618 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cd3778-5a5a-483a-af22-5b8420ae896b" containerName="openstackclient" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.067642 
4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cd3778-5a5a-483a-af22-5b8420ae896b" containerName="openstackclient" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.067896 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cd3778-5a5a-483a-af22-5b8420ae896b" containerName="openstackclient" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.070407 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.093724 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.139291 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7\") pod \"placement16c0-account-delete-dzcpq\" (UID: \"7aa25f12-05b2-4632-921a-d126059a63be\") " pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.258955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7\") pod \"placement16c0-account-delete-dzcpq\" (UID: \"7aa25f12-05b2-4632-921a-d126059a63be\") " pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.275617 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.300279 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7\") pod 
\"placement16c0-account-delete-dzcpq\" (UID: \"7aa25f12-05b2-4632-921a-d126059a63be\") " pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.353123 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8jr42"] Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.367928 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.367984 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data podName:1fa59157-6b4e-4379-89e0-415e74c581a8 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:02.867969005 +0000 UTC m=+1369.963127201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data") pod "rabbitmq-cell1-server-0" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8") : configmap "rabbitmq-cell1-config-data" not found Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.369242 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8" exitCode=0 Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.369281 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8"} Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.369306 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b"} Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.369321 4822 scope.go:117] "RemoveContainer" containerID="c43ffe7835942c8ad421a26f02d63196eea012c628c464759216b8f8a59f7812" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.381025 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8jr42"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.389869 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.391757 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder5f70-account-delete-kxtp9" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.398271 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.402832 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.420744 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-t5dtz"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.434850 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-t5dtz"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.443743 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.444513 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="openstack-network-exporter" 
containerID="cri-o://c3480d312ca94790932ca6354883bc03b463bd75e6035ae95f7c287a9d29849c" gracePeriod=300 Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.452581 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.454141 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9441-account-delete-zxr9g" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.470697 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69zwd\" (UniqueName: \"kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd\") pod \"cinder5f70-account-delete-kxtp9\" (UID: \"dc72727a-70e5-402e-90f1-2c54c48dd5f8\") " pod="openstack/cinder5f70-account-delete-kxtp9" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.473291 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.545342 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.572955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69zwd\" (UniqueName: \"kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd\") pod \"cinder5f70-account-delete-kxtp9\" (UID: \"dc72727a-70e5-402e-90f1-2c54c48dd5f8\") " pod="openstack/cinder5f70-account-delete-kxtp9" Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.573161 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjqz\" (UniqueName: \"kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz\") pod \"barbican9441-account-delete-zxr9g\" (UID: \"c07072cb-ae19-4dcb-9f52-432fe923949d\") " 
pod="openstack/barbican9441-account-delete-zxr9g"
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.574544 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.574578 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data podName:48fba34a-0289-41f0-b1d7-bb71a22253a3 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:03.074566597 +0000 UTC m=+1370.169724793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data") pod "rabbitmq-server-0" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3") : configmap "rabbitmq-config-data" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.617577 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69zwd\" (UniqueName: \"kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd\") pod \"cinder5f70-account-delete-kxtp9\" (UID: \"dc72727a-70e5-402e-90f1-2c54c48dd5f8\") " pod="openstack/cinder5f70-account-delete-kxtp9"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.628673 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.633463 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron834d-account-delete-j5ft4"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.643913 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.656731 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="ovsdbserver-sb" containerID="cri-o://85d5df3b347c7c673d52d3c41f938ec4fc9dd982fd6b9414a88fc92d459a23ab" gracePeriod=300
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.676666 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwpt\" (UniqueName: \"kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt\") pod \"neutron834d-account-delete-j5ft4\" (UID: \"52264dc7-4118-484f-ab12-1bfd17172c20\") " pod="openstack/neutron834d-account-delete-j5ft4"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.676740 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjqz\" (UniqueName: \"kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz\") pod \"barbican9441-account-delete-zxr9g\" (UID: \"c07072cb-ae19-4dcb-9f52-432fe923949d\") " pod="openstack/barbican9441-account-delete-zxr9g"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.696122 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.696723 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="openstack-network-exporter" containerID="cri-o://02cf6c43ff65c5cfd73aa24c2571a632c72706b61bec1beff2c761fecf4a63ac" gracePeriod=300
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.721382 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-plp78"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.736753 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder5f70-account-delete-kxtp9"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.754150 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-plp78"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.797835 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjqz\" (UniqueName: \"kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz\") pod \"barbican9441-account-delete-zxr9g\" (UID: \"c07072cb-ae19-4dcb-9f52-432fe923949d\") " pod="openstack/barbican9441-account-delete-zxr9g"
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.799052 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwpt\" (UniqueName: \"kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt\") pod \"neutron834d-account-delete-j5ft4\" (UID: \"52264dc7-4118-484f-ab12-1bfd17172c20\") " pod="openstack/neutron834d-account-delete-j5ft4"
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.799705 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.799733 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: secret "swift-conf" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.799779 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:03.299763624 +0000 UTC m=+1370.394921820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : secret "swift-conf" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.823993 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.824306 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd" containerID="cri-o://964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" gracePeriod=30
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.824457 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="openstack-network-exporter" containerID="cri-o://a699c14506381f4dd996d947b346eb98c9c53450eafb1392ee285ce66c591795" gracePeriod=30
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.889459 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwpt\" (UniqueName: \"kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt\") pod \"neutron834d-account-delete-j5ft4\" (UID: \"52264dc7-4118-484f-ab12-1bfd17172c20\") " pod="openstack/neutron834d-account-delete-j5ft4"
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.906319 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: E1010 06:47:02.906383 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data podName:1fa59157-6b4e-4379-89e0-415e74c581a8 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:03.906366155 +0000 UTC m=+1371.001524351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data") pod "rabbitmq-cell1-server-0" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8") : configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.967291 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qqsbv"]
Oct 10 06:47:02 crc kubenswrapper[4822]: I1010 06:47:02.995462 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qqsbv"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.014660 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9441-account-delete-zxr9g"
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.112163 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.112229 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data podName:48fba34a-0289-41f0-b1d7-bb71a22253a3 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:04.112214095 +0000 UTC m=+1371.207372291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data") pod "rabbitmq-server-0" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3") : configmap "rabbitmq-config-data" not found
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.132853 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.134356 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.134381 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="ovsdbserver-nb" containerID="cri-o://365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" gracePeriod=300
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.135616 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron834d-account-delete-j5ft4"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.169271 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.177962 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd" probeResult="failure" output=<
Oct 10 06:47:03 crc kubenswrapper[4822]: 2025-10-10T06:47:02Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl
Oct 10 06:47:03 crc kubenswrapper[4822]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory)
Oct 10 06:47:03 crc kubenswrapper[4822]: 2025-10-10T06:47:03Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl
Oct 10 06:47:03 crc kubenswrapper[4822]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory)
Oct 10 06:47:03 crc kubenswrapper[4822]: >
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.183373 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" cmd=["/usr/bin/pidof","ovsdb-server"]
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.193928 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" cmd=["/usr/bin/pidof","ovsdb-server"]
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.198633 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" cmd=["/usr/bin/pidof","ovsdb-server"]
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.198692 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="ovsdbserver-nb"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.209598 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc97c446d-qd577"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.217980 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc97c446d-qd577" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-log" containerID="cri-o://f37aae3b6d57636153fe7116fe10389d8dc0009af7d1ea6600168de93249a807" gracePeriod=30
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.218178 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc97c446d-qd577" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-api" containerID="cri-o://a1f35f0eceeefdb49f84dae81b99596158510c253b94b22659bccc55d2420d00" gracePeriod=30
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.219515 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srsc\" (UniqueName: \"kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc\") pod \"novacell11f81-account-delete-mz2rx\" (UID: \"6c2b7a8e-ab63-4d56-929e-1e6898294956\") " pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.222760 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell07142-account-delete-jddpj"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.225575 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07142-account-delete-jddpj"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.255260 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell07142-account-delete-jddpj"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.288999 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5jdbx"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.322142 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbsf\" (UniqueName: \"kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf\") pod \"novacell07142-account-delete-jddpj\" (UID: \"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86\") " pod="openstack/novacell07142-account-delete-jddpj"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.322237 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srsc\" (UniqueName: \"kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc\") pod \"novacell11f81-account-delete-mz2rx\" (UID: \"6c2b7a8e-ab63-4d56-929e-1e6898294956\") " pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.322756 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.322784 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.322818 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.322835 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Oct 10 06:47:03 crc kubenswrapper[4822]: E1010 06:47:03.322881 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:04.322864893 +0000 UTC m=+1371.418023089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.342467 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.342746 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6l7hn" podUID="241b1f65-5edb-4965-b9af-e8e12b73124c" containerName="openstack-network-exporter" containerID="cri-o://80e86c1e4ef3bdd1b551d89f9e5f8eac8c6d612c80b88f1c7b4a89756edf38ef" gracePeriod=30
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.352160 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.371004 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.389076 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"]
Oct 10 06:47:03 crc kubenswrapper[4822]: I1010 06:47:03.399741 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.444089 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srsc\" (UniqueName: \"kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc\") pod \"novacell11f81-account-delete-mz2rx\" (UID: \"6c2b7a8e-ab63-4d56-929e-1e6898294956\") " pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.456492 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbsf\" (UniqueName: \"kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf\") pod \"novacell07142-account-delete-jddpj\" (UID: \"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86\") " pod="openstack/novacell07142-account-delete-jddpj"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.472144 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.482414 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="cinder-scheduler" containerID="cri-o://efce444dc287af620565305af2a58baca7c83295b3bd9c0cfb54af7d64975eef" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.482903 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="probe" containerID="cri-o://6bab53c5a6f89332c0aa8004dfd4394eade34796f5bf17f77ba2d71af6be542f" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.497599 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.499611 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbsf\" (UniqueName: \"kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf\") pod \"novacell07142-account-delete-jddpj\" (UID: \"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86\") " pod="openstack/novacell07142-account-delete-jddpj"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.525079 4822 generic.go:334] "Generic (PLEG): container finished" podID="22236db0-c666-44e4-a290-66626e76cdad" containerID="c3480d312ca94790932ca6354883bc03b463bd75e6035ae95f7c287a9d29849c" exitCode=2
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.525194 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerDied","Data":"c3480d312ca94790932ca6354883bc03b463bd75e6035ae95f7c287a9d29849c"}
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.591517 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g77z\" (UniqueName: \"kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z\") pod \"novaapibdb6-account-delete-qgrcb\" (UID: \"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9\") " pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.619662 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07142-account-delete-jddpj"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.699446 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g77z\" (UniqueName: \"kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z\") pod \"novaapibdb6-account-delete-qgrcb\" (UID: \"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9\") " pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.700157 4822 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/glance-default-external-api-0" secret="" err="secret \"glance-glance-dockercfg-twbbx\" not found"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.723189 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3156ed-785b-492d-923f-cbd97a996b43" path="/var/lib/kubelet/pods/3d3156ed-785b-492d-923f-cbd97a996b43/volumes"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.725964 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b14558-f019-4f51-a3ab-b5689de6336a" path="/var/lib/kubelet/pods/41b14558-f019-4f51-a3ab-b5689de6336a/volumes"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.727098 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1de65e8-9721-4039-8c67-9fbb0d715693" path="/var/lib/kubelet/pods/a1de65e8-9721-4039-8c67-9fbb0d715693/volumes"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.727747 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82a5873-895d-4f27-ab6b-c264a59949a6" path="/var/lib/kubelet/pods/c82a5873-895d-4f27-ab6b-c264a59949a6/volumes"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.728407 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hzr4f"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.728434 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hzr4f"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.729034 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.729305 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api-log" containerID="cri-o://f83cde6f4a70de9b355f7b282554b988566370270eca0205f68d4daaaf187345" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.729623 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api" containerID="cri-o://5768417b026ddc15a2a8c2d6da91a0f1f8ec8b7c89708cc85cf21fe86db42db9" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.735476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g77z\" (UniqueName: \"kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z\") pod \"novaapibdb6-account-delete-qgrcb\" (UID: \"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9\") " pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.765212 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ktcrj"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.774213 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd" probeResult="failure" output="command timed out"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.796046 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.796266 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="dnsmasq-dns" containerID="cri-o://203a642bf3c3f9b4af40226e7974ac31a360eac399b610e485b7214df187d558" gracePeriod=10
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:03.805140 4822 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:03.805193 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:04.305176498 +0000 UTC m=+1371.400334694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-default-external-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:03.805856 4822 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:03.805913 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:04.305895288 +0000 UTC m=+1371.401053564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-scripts" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.836340 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ktcrj"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.846486 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.860053 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cshfh"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.873397 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cshfh"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.887261 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.887559 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c7474d4d9-hl56q" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-api" containerID="cri-o://1dd0282c32b952bc28426c99bc939d6cd5556dd9df6f57d786cbccb734a3f9db" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.888076 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c7474d4d9-hl56q" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-httpd" containerID="cri-o://a63c8d6a853e7bf676f1d100b90459d644b828358cb3a287d4e0e0565d941778" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.924928 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5m8q5"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.954504 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:03.955084 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-5m8q5"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.002078 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.002523 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-server" containerID="cri-o://e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.002969 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="swift-recon-cron" containerID="cri-o://c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003011 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="rsync" containerID="cri-o://67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003042 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-expirer" containerID="cri-o://13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003070 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-updater" containerID="cri-o://11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003095 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-auditor" containerID="cri-o://cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003122 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-replicator" containerID="cri-o://e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003149 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-server" containerID="cri-o://0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003177 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-updater" containerID="cri-o://b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003205 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-auditor" containerID="cri-o://7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003238 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-replicator" containerID="cri-o://4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003279 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-server" containerID="cri-o://04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003309 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-reaper" containerID="cri-o://ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003335 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-auditor" containerID="cri-o://19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.003366 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-replicator" containerID="cri-o://3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c" gracePeriod=30
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.012846 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gm7d6"]
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.023638 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.023709 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data podName:1fa59157-6b4e-4379-89e0-415e74c581a8 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:06.023691673 +0000 UTC m=+1373.118849869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data") pod "rabbitmq-cell1-server-0" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8") : configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.066685 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gm7d6"]
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.129101 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.129168 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data podName:48fba34a-0289-41f0-b1d7-bb71a22253a3 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:06.12914631 +0000 UTC m=+1373.224304506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data") pod "rabbitmq-server-0" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3") : configmap "rabbitmq-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.129607 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-16c0-account-create-x24p8"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.149943 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-16c0-account-create-x24p8"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.236921 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.276852 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.326165 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ntwrt"]
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328545 4822 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328594 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:05.328579966 +0000 UTC m=+1372.423738162 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-default-external-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328642 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328657 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328669 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328681 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328707 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:06.328698729 +0000 UTC m=+1373.423856925 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328743 4822 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.328761 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:05.328755501 +0000 UTC m=+1372.423913687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-scripts" not found Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.368949 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-w2kh5"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.373949 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5f70-account-create-4cjwb"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.380833 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ntwrt"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.401512 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5f70-account-create-4cjwb"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.408623 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 
06:47:04.416955 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="rabbitmq" containerID="cri-o://a829a2721fe99b524e5ca7cb2318d1332bb2968f94f80cd142fd5a74891aa843" gracePeriod=604800 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.428366 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-w2kh5"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.436892 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9441-account-create-jrgnn"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.449222 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9441-account-create-jrgnn"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.456668 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.470981 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.498696 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mz7mk"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.512177 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mz7mk"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.589335 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d814-account-create-src5c"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.599340 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d814-account-create-src5c"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.616898 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: 
I1010 06:47:04.617187 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-log" containerID="cri-o://ff627b4112d5afa4366e1239ee8cc64edf816c0effeb7dafc7e69ae5cc1194ec" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.617363 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-httpd" containerID="cri-o://80843954eb6d7ec548593d47ef32a3ec88c616382899f1b6da51f373178452d9" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.636501 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" containerID="cri-o://84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" gracePeriod=29 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.640244 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ngq5g"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.653841 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ngq5g"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.694613 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"] Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.707602 4822 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 10 06:47:04 crc kubenswrapper[4822]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 06:47:04 crc kubenswrapper[4822]: + source /usr/local/bin/container-scripts/functions Oct 10 06:47:04 crc kubenswrapper[4822]: ++ 
OVNBridge=br-int Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNRemote=tcp:localhost:6642 Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNEncapType=geneve Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNAvailabilityZones= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ EnableChassisAsGateway=true Oct 10 06:47:04 crc kubenswrapper[4822]: ++ PhysicalNetworks= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNHostName= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 06:47:04 crc kubenswrapper[4822]: ++ ovs_dir=/var/lib/openvswitch Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 06:47:04 crc kubenswrapper[4822]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + cleanup_ovsdb_server_semaphore Oct 10 06:47:04 crc kubenswrapper[4822]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 06:47:04 crc kubenswrapper[4822]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-nrgt7" message=< Oct 10 06:47:04 crc kubenswrapper[4822]: Exiting ovsdb-server (5) [ OK ] Oct 10 06:47:04 crc kubenswrapper[4822]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 06:47:04 crc kubenswrapper[4822]: + source /usr/local/bin/container-scripts/functions Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNBridge=br-int Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNRemote=tcp:localhost:6642 Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNEncapType=geneve Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNAvailabilityZones= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ EnableChassisAsGateway=true Oct 10 06:47:04 crc kubenswrapper[4822]: ++ PhysicalNetworks= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNHostName= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 06:47:04 crc kubenswrapper[4822]: ++ ovs_dir=/var/lib/openvswitch Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 06:47:04 crc kubenswrapper[4822]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + cleanup_ovsdb_server_semaphore Oct 10 06:47:04 crc kubenswrapper[4822]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 06:47:04 crc kubenswrapper[4822]: > Oct 10 06:47:04 crc kubenswrapper[4822]: E1010 06:47:04.707946 4822 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 10 06:47:04 crc kubenswrapper[4822]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 06:47:04 crc kubenswrapper[4822]: + source /usr/local/bin/container-scripts/functions Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNBridge=br-int Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNRemote=tcp:localhost:6642 Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNEncapType=geneve Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNAvailabilityZones= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ EnableChassisAsGateway=true Oct 10 06:47:04 crc kubenswrapper[4822]: ++ PhysicalNetworks= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ OVNHostName= Oct 10 06:47:04 crc kubenswrapper[4822]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 06:47:04 crc kubenswrapper[4822]: ++ ovs_dir=/var/lib/openvswitch Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 06:47:04 crc kubenswrapper[4822]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 06:47:04 crc kubenswrapper[4822]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + sleep 0.5 Oct 10 06:47:04 crc kubenswrapper[4822]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 06:47:04 crc kubenswrapper[4822]: + cleanup_ovsdb_server_semaphore Oct 10 06:47:04 crc kubenswrapper[4822]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 06:47:04 crc kubenswrapper[4822]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 06:47:04 crc kubenswrapper[4822]: > pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" containerID="cri-o://3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.707732 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-834d-account-create-qlz2z"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.707986 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" containerID="cri-o://3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" gracePeriod=29 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.730533 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.730838 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener-log" containerID="cri-o://72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a" gracePeriod=30 
Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.731254 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener" containerID="cri-o://1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.754911 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-834d-account-create-qlz2z"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.777007 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.777298 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b5444654f-5wp86" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker-log" containerID="cri-o://a1174022bfa90fdbbc7bdb6448ded60ea8ea9a3effbb5e0c5631a9b1f7bfe1e1" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.777426 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b5444654f-5wp86" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker" containerID="cri-o://fda58ea5f9b88b81e2eae62a1675670199fddb3e9d024a70d2a8d75abe7fbe9f" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.790681 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.805993 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.809130 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_108483bc-0a52-4ac2-8086-fa89466ea3aa/ovsdbserver-nb/0.log" Oct 10 
06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.809238 4822 generic.go:334] "Generic (PLEG): container finished" podID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerID="02cf6c43ff65c5cfd73aa24c2571a632c72706b61bec1beff2c761fecf4a63ac" exitCode=2 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.809255 4822 generic.go:334] "Generic (PLEG): container finished" podID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerID="365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" exitCode=143 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.810113 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerDied","Data":"02cf6c43ff65c5cfd73aa24c2571a632c72706b61bec1beff2c761fecf4a63ac"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.810173 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerDied","Data":"365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.818457 4822 generic.go:334] "Generic (PLEG): container finished" podID="7a076d47-5de3-4eba-a933-265448eb8a11" containerID="a63c8d6a853e7bf676f1d100b90459d644b828358cb3a287d4e0e0565d941778" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.818521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerDied","Data":"a63c8d6a853e7bf676f1d100b90459d644b828358cb3a287d4e0e0565d941778"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.820941 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.821231 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-log" containerID="cri-o://3eab14a45cceb25e06794fdc7e4a763b3c3af3f0e97c2df5eb1c6cd89012256a" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.821501 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-api" containerID="cri-o://c06e8d4a5f40e2de362c673f9a2482416e108f4360a178ff0fccd0dcaf4c36e9" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.833545 4822 generic.go:334] "Generic (PLEG): container finished" podID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerID="203a642bf3c3f9b4af40226e7974ac31a360eac399b610e485b7214df187d558" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.833632 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" event={"ID":"999b3a9f-9559-4baa-9f36-4f91631fb1fc","Type":"ContainerDied","Data":"203a642bf3c3f9b4af40226e7974ac31a360eac399b610e485b7214df187d558"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.835489 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.835687 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6b5485c95f-w8q56" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-httpd" containerID="cri-o://065d149d1b211170467b39093535c955aae1f107db060308f9576b5afffb25a6" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.836204 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6b5485c95f-w8q56" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-server" containerID="cri-o://d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52" gracePeriod=30 Oct 10 06:47:04 crc 
kubenswrapper[4822]: I1010 06:47:04.858732 4822 generic.go:334] "Generic (PLEG): container finished" podID="7aa25f12-05b2-4632-921a-d126059a63be" containerID="7f58aba361d15a267262f8637de2ab720390632cc57a89026f070c3fadcc16d0" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.859614 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.859650 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement16c0-account-delete-dzcpq" event={"ID":"7aa25f12-05b2-4632-921a-d126059a63be","Type":"ContainerDied","Data":"7f58aba361d15a267262f8637de2ab720390632cc57a89026f070c3fadcc16d0"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.859671 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement16c0-account-delete-dzcpq" event={"ID":"7aa25f12-05b2-4632-921a-d126059a63be","Type":"ContainerStarted","Data":"5f17bd753149822470d3379e7b6816046320eaa0d3cff344e216fd08311e7f48"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.859894 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api-log" containerID="cri-o://489b35ea36a581d0ca50df223a61ef6883583aa6d15472a109b5513aa80e1f47" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.860231 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api" containerID="cri-o://d17675142c2329604a137158115089b304e6ce04d8ec7938439c69f02db6cd14" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.878041 4822 generic.go:334] "Generic (PLEG): container finished" podID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" 
containerID="f37aae3b6d57636153fe7116fe10389d8dc0009af7d1ea6600168de93249a807" exitCode=143 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.878151 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerDied","Data":"f37aae3b6d57636153fe7116fe10389d8dc0009af7d1ea6600168de93249a807"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.892451 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6l7hn_241b1f65-5edb-4965-b9af-e8e12b73124c/openstack-network-exporter/0.log" Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.892501 4822 generic.go:334] "Generic (PLEG): container finished" podID="241b1f65-5edb-4965-b9af-e8e12b73124c" containerID="80e86c1e4ef3bdd1b551d89f9e5f8eac8c6d612c80b88f1c7b4a89756edf38ef" exitCode=2 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.892561 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6l7hn" event={"ID":"241b1f65-5edb-4965-b9af-e8e12b73124c","Type":"ContainerDied","Data":"80e86c1e4ef3bdd1b551d89f9e5f8eac8c6d612c80b88f1c7b4a89756edf38ef"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.900158 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.900417 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log" containerID="cri-o://ad3e61286e546aa785f998bc78158faaf28a9227d533f3ce1eeace9ba41ab652" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.900828 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata" 
containerID="cri-o://080914d126025b6e485b39c65b1fd29e1335110a9ec2a7216a930a1f2cc0164e" gracePeriod=30 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.914743 4822 generic.go:334] "Generic (PLEG): container finished" podID="c8cd3778-5a5a-483a-af22-5b8420ae896b" containerID="37b4d43e434fbc8b371e9636d76d9af8856d96fcc8463cdd3a2a306557a6b929" exitCode=137 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.920918 4822 generic.go:334] "Generic (PLEG): container finished" podID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerID="a699c14506381f4dd996d947b346eb98c9c53450eafb1392ee285ce66c591795" exitCode=2 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.920987 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerDied","Data":"a699c14506381f4dd996d947b346eb98c9c53450eafb1392ee285ce66c591795"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.934887 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-v88zt"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.945198 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="rabbitmq" containerID="cri-o://65212beb044396c22b0fae65dc35f02089f6f4279e19167d0de48c69116a6853" gracePeriod=604800 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.953973 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1f81-account-create-66lbl"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.963140 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1f81-account-create-66lbl"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.980466 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"] Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983639 
4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983665 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983672 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983679 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983685 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983691 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983696 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983702 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664" exitCode=0 Oct 10 
06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983709 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983715 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983721 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983727 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983735 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983743 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3" exitCode=0 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983779 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983828 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983837 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983845 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983854 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983863 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983872 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 
06:47:04.983880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983889 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983898 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983908 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983916 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.983928 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.990031 4822 generic.go:334] "Generic (PLEG): container finished" podID="f0da7840-eaa9-46a7-bda6-5de928993572" containerID="f83cde6f4a70de9b355f7b282554b988566370270eca0205f68d4daaaf187345" 
exitCode=143 Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.990095 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerDied","Data":"f83cde6f4a70de9b355f7b282554b988566370270eca0205f68d4daaaf187345"} Oct 10 06:47:04 crc kubenswrapper[4822]: I1010 06:47:04.990951 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-v88zt"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.008288 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.008711 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.016753 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.016992 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerName="nova-scheduler-scheduler" containerID="cri-o://81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.022129 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_22236db0-c666-44e4-a290-66626e76cdad/ovsdbserver-sb/0.log" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.022187 4822 generic.go:334] "Generic (PLEG): container finished" podID="22236db0-c666-44e4-a290-66626e76cdad" containerID="85d5df3b347c7c673d52d3c41f938ec4fc9dd982fd6b9414a88fc92d459a23ab" exitCode=143 Oct 10 
06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.022410 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-log" containerID="cri-o://9baba77c235fa2da53a1933646bcf77963daf0d1bceccb6370d40760af07bd34" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.022528 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerDied","Data":"85d5df3b347c7c673d52d3c41f938ec4fc9dd982fd6b9414a88fc92d459a23ab"} Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.022892 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-httpd" containerID="cri-o://0f9a24b9406d92324b02f103bd4f78da68951fba025fb39c169f20c3b804f0f8" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.028117 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p9vff"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.075052 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.075361 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor" containerID="cri-o://6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.105499 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p9vff"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.130862 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.131095 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerName="nova-cell0-conductor-conductor" containerID="cri-o://136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.132789 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="galera" containerID="cri-o://cea6be947bf52e2474312230db60ef4f19b18c57064bc9913aa1efa9a6406d53" gracePeriod=30 Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.139496 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jk5j7"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.174985 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jk5j7"] Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.227436 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.238665 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.310661 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.197:6080/vnc_lite.html\": dial tcp 10.217.0.197:6080: connect: connection refused" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.316060 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_108483bc-0a52-4ac2-8086-fa89466ea3aa/ovsdbserver-nb/0.log" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.316163 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.333240 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6l7hn_241b1f65-5edb-4965-b9af-e8e12b73124c/openstack-network-exporter/0.log" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.333328 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.388153 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_22236db0-c666-44e4-a290-66626e76cdad/ovsdbserver-sb/0.log" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.388228 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396287 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9q5z\" (UniqueName: \"kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z\") pod \"c8cd3778-5a5a-483a-af22-5b8420ae896b\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396358 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396395 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396439 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396504 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle\") pod \"c8cd3778-5a5a-483a-af22-5b8420ae896b\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396544 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config\") pod \"c8cd3778-5a5a-483a-af22-5b8420ae896b\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396601 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fjfj\" (UniqueName: \"kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396654 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396717 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret\") pod \"c8cd3778-5a5a-483a-af22-5b8420ae896b\" (UID: \"c8cd3778-5a5a-483a-af22-5b8420ae896b\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.396778 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0\") pod \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\" (UID: \"999b3a9f-9559-4baa-9f36-4f91631fb1fc\") " Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.397539 4822 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.402162 4822 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:07.402133181 +0000 UTC m=+1374.497291377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-default-external-config-data" not found Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.403547 4822 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.403618 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:07.403597363 +0000 UTC m=+1374.498755559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-scripts" not found Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.428449 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z" (OuterVolumeSpecName: "kube-api-access-n9q5z") pod "c8cd3778-5a5a-483a-af22-5b8420ae896b" (UID: "c8cd3778-5a5a-483a-af22-5b8420ae896b"). InnerVolumeSpecName "kube-api-access-n9q5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.444284 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj" (OuterVolumeSpecName: "kube-api-access-5fjfj") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "kube-api-access-5fjfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.488892 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8cd3778-5a5a-483a-af22-5b8420ae896b" (UID: "c8cd3778-5a5a-483a-af22-5b8420ae896b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498322 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdgdz\" (UniqueName: \"kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498391 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498429 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: 
\"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498530 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498561 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498582 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498704 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct98x\" (UniqueName: \"kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498740 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498829 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498855 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498877 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498928 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.498954 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499020 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: 
I1010 06:47:05.499092 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499287 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499377 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499439 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bq6b\" (UniqueName: \"kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b\") pod \"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499463 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499503 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle\") pod 
\"108483bc-0a52-4ac2-8086-fa89466ea3aa\" (UID: \"108483bc-0a52-4ac2-8086-fa89466ea3aa\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499558 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle\") pod \"241b1f65-5edb-4965-b9af-e8e12b73124c\" (UID: \"241b1f65-5edb-4965-b9af-e8e12b73124c\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.499598 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config\") pod \"22236db0-c666-44e4-a290-66626e76cdad\" (UID: \"22236db0-c666-44e4-a290-66626e76cdad\") " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.500334 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fjfj\" (UniqueName: \"kubernetes.io/projected/999b3a9f-9559-4baa-9f36-4f91631fb1fc-kube-api-access-5fjfj\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.500358 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9q5z\" (UniqueName: \"kubernetes.io/projected/c8cd3778-5a5a-483a-af22-5b8420ae896b-kube-api-access-n9q5z\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.500370 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.500526 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). 
InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.501146 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config" (OuterVolumeSpecName: "config") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.501856 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.502198 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.502352 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.503093 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config" (OuterVolumeSpecName: "config") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.505209 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts" (OuterVolumeSpecName: "scripts") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.505245 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config" (OuterVolumeSpecName: "config") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.510087 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts" (OuterVolumeSpecName: "scripts") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.522265 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.523571 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x" (OuterVolumeSpecName: "kube-api-access-ct98x") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "kube-api-access-ct98x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.527919 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.549082 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b" (OuterVolumeSpecName: "kube-api-access-5bq6b") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "kube-api-access-5bq6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.571956 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz" (OuterVolumeSpecName: "kube-api-access-rdgdz") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "kube-api-access-rdgdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605294 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605327 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct98x\" (UniqueName: \"kubernetes.io/projected/241b1f65-5edb-4965-b9af-e8e12b73124c-kube-api-access-ct98x\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605344 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605357 4822 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/241b1f65-5edb-4965-b9af-e8e12b73124c-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605411 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605430 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605442 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605454 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605466 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bq6b\" (UniqueName: \"kubernetes.io/projected/108483bc-0a52-4ac2-8086-fa89466ea3aa-kube-api-access-5bq6b\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605478 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22236db0-c666-44e4-a290-66626e76cdad-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605489 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241b1f65-5edb-4965-b9af-e8e12b73124c-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605499 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22236db0-c666-44e4-a290-66626e76cdad-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605510 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdgdz\" (UniqueName: \"kubernetes.io/projected/22236db0-c666-44e4-a290-66626e76cdad-kube-api-access-rdgdz\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.605521 
4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/108483bc-0a52-4ac2-8086-fa89466ea3aa-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.638459 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c8cd3778-5a5a-483a-af22-5b8420ae896b" (UID: "c8cd3778-5a5a-483a-af22-5b8420ae896b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.688165 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1006eabf-bc60-449e-9650-5f2e2969f08c" path="/var/lib/kubelet/pods/1006eabf-bc60-449e-9650-5f2e2969f08c/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.689435 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a6f02a-7437-4e2b-8e89-319f47afc92f" path="/var/lib/kubelet/pods/13a6f02a-7437-4e2b-8e89-319f47afc92f/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.690067 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25544312-5e94-49ce-8726-8259264add47" path="/var/lib/kubelet/pods/25544312-5e94-49ce-8726-8259264add47/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.694001 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3" path="/var/lib/kubelet/pods/2a8d07a7-5a69-4274-bc4a-f6a3cb24d6c3/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.694721 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37631965-5ba5-48e5-93a4-b5caa45ac6e5" path="/var/lib/kubelet/pods/37631965-5ba5-48e5-93a4-b5caa45ac6e5/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.696346 4822 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="3b05baa2-bc07-4801-a944-30ee51acd6c5" path="/var/lib/kubelet/pods/3b05baa2-bc07-4801-a944-30ee51acd6c5/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.701864 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa" path="/var/lib/kubelet/pods/41f3ff0b-a9cb-49ce-a6e6-08bdcf49fdaa/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.702644 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4608adcf-203f-4314-a574-18318c584d21" path="/var/lib/kubelet/pods/4608adcf-203f-4314-a574-18318c584d21/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.703495 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc70e85-31c3-40fd-97ca-3522817405ad" path="/var/lib/kubelet/pods/4bc70e85-31c3-40fd-97ca-3522817405ad/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.704630 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6d2357-fbc8-4a1e-92af-1da00c2d7b89" path="/var/lib/kubelet/pods/5f6d2357-fbc8-4a1e-92af-1da00c2d7b89/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.705988 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae86b10-5f5b-4f05-a5d1-b39b3334ea89" path="/var/lib/kubelet/pods/8ae86b10-5f5b-4f05-a5d1-b39b3334ea89/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.706624 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9d6eca-faa2-4f53-978f-a547fd3fc131" path="/var/lib/kubelet/pods/9a9d6eca-faa2-4f53-978f-a547fd3fc131/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.707321 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f56185-1fd1-420a-9634-e982d9644d21" path="/var/lib/kubelet/pods/a9f56185-1fd1-420a-9634-e982d9644d21/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.707447 4822 reconciler_common.go:293] "Volume detached 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.722214 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8447229-0ca7-47ad-9404-c84da379670f" path="/var/lib/kubelet/pods/b8447229-0ca7-47ad-9404-c84da379670f/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.743311 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.746597 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.756369 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.756446 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor" Oct 10 06:47:05 crc 
kubenswrapper[4822]: I1010 06:47:05.767148 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9987828-34f4-4a64-ac85-8c9ee937fe1c" path="/var/lib/kubelet/pods/b9987828-34f4-4a64-ac85-8c9ee937fe1c/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.779636 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a3f897-e51e-4ecf-b173-e25d2f000a07" path="/var/lib/kubelet/pods/c6a3f897-e51e-4ecf-b173-e25d2f000a07/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.780387 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea655c41-9403-41cc-9bd1-e362ee5af607" path="/var/lib/kubelet/pods/ea655c41-9403-41cc-9bd1-e362ee5af607/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.781201 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfe8139-a900-47aa-a6dd-64f3f69c6d08" path="/var/lib/kubelet/pods/fbfe8139-a900-47aa-a6dd-64f3f69c6d08/volumes" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.823012 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.910304 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.911652 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.911674 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:05 crc kubenswrapper[4822]: E1010 06:47:05.926929 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0df354f0_e9e8_441a_a676_8a6468b8c191.slice/crio-conmon-d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0df354f0_e9e8_441a_a676_8a6468b8c191.slice/crio-d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod419c8ee7_56fd_43cc_86de_7f647c708502.slice/crio-54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146.scope\": RecentStats: unable to find data in memory cache]" Oct 10 06:47:05 crc kubenswrapper[4822]: I1010 06:47:05.970341 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.016619 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: 
I1010 06:47:06.037619 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.074980 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.087951 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.133922 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.139199 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.139261 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.139273 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.139301 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data podName:48fba34a-0289-41f0-b1d7-bb71a22253a3 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:10.139283697 +0000 UTC m=+1377.234441893 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data") pod "rabbitmq-server-0" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3") : configmap "rabbitmq-config-data" not found Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.139344 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data podName:1fa59157-6b4e-4379-89e0-415e74c581a8 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:10.139323968 +0000 UTC m=+1377.234482204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data") pod "rabbitmq-cell1-server-0" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8") : configmap "rabbitmq-cell1-config-data" not found Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.139666 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.139682 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.193173 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config" (OuterVolumeSpecName: "config") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.222395 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.256721 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.261488 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.277148 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.280886 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.290977 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c8cd3778-5a5a-483a-af22-5b8420ae896b" (UID: "c8cd3778-5a5a-483a-af22-5b8420ae896b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.291609 4822 generic.go:334] "Generic (PLEG): container finished" podID="d5180126-ac55-464c-90dd-565daffba54c" containerID="ff627b4112d5afa4366e1239ee8cc64edf816c0effeb7dafc7e69ae5cc1194ec" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.304934 4822 generic.go:334] "Generic (PLEG): container finished" podID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerID="3eab14a45cceb25e06794fdc7e4a763b3c3af3f0e97c2df5eb1c6cd89012256a" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.309923 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.311823 4822 generic.go:334] "Generic (PLEG): container finished" podID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerID="6bab53c5a6f89332c0aa8004dfd4394eade34796f5bf17f77ba2d71af6be542f" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.311859 4822 generic.go:334] "Generic (PLEG): container finished" podID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerID="efce444dc287af620565305af2a58baca7c83295b3bd9c0cfb54af7d64975eef" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.332979 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "999b3a9f-9559-4baa-9f36-4f91631fb1fc" (UID: "999b3a9f-9559-4baa-9f36-4f91631fb1fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.334003 4822 generic.go:334] "Generic (PLEG): container finished" podID="3d602476-cde4-435f-93bc-a72c137d1b58" containerID="72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.343913 4822 generic.go:334] "Generic (PLEG): container finished" podID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerID="d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.343941 4822 generic.go:334] "Generic (PLEG): container finished" podID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerID="065d149d1b211170467b39093535c955aae1f107db060308f9576b5afffb25a6" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.345232 4822 generic.go:334] "Generic (PLEG): container finished" podID="419c8ee7-56fd-43cc-86de-7f647c708502" 
containerID="54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146" exitCode=0 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.351184 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359145 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8cd3778-5a5a-483a-af22-5b8420ae896b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359180 4822 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359193 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359206 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359242 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/999b3a9f-9559-4baa-9f36-4f91631fb1fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.359257 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 
06:47:06.359193 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.359287 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.359386 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.359404 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:06 crc kubenswrapper[4822]: E1010 06:47:06.359454 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:10.359435969 +0000 UTC m=+1377.454594165 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.361471 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "22236db0-c666-44e4-a290-66626e76cdad" (UID: "22236db0-c666-44e4-a290-66626e76cdad"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.363879 4822 generic.go:334] "Generic (PLEG): container finished" podID="14ce9853-109f-456d-b51c-b1d11072a90d" containerID="9baba77c235fa2da53a1933646bcf77963daf0d1bceccb6370d40760af07bd34" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.368292 4822 generic.go:334] "Generic (PLEG): container finished" podID="35854fe5-2e29-4a49-9783-873bee1058e2" containerID="489b35ea36a581d0ca50df223a61ef6883583aa6d15472a109b5513aa80e1f47" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.371008 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6l7hn_241b1f65-5edb-4965-b9af-e8e12b73124c/openstack-network-exporter/0.log" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.371133 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6l7hn" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.382975 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerDied","Data":"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383014 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerDied","Data":"ff627b4112d5afa4366e1239ee8cc64edf816c0effeb7dafc7e69ae5cc1194ec"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383029 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerDied","Data":"3eab14a45cceb25e06794fdc7e4a763b3c3af3f0e97c2df5eb1c6cd89012256a"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383041 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerDied","Data":"6bab53c5a6f89332c0aa8004dfd4394eade34796f5bf17f77ba2d71af6be542f"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383053 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerDied","Data":"efce444dc287af620565305af2a58baca7c83295b3bd9c0cfb54af7d64975eef"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383062 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerDied","Data":"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383073 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerDied","Data":"d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383083 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerDied","Data":"065d149d1b211170467b39093535c955aae1f107db060308f9576b5afffb25a6"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383092 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"419c8ee7-56fd-43cc-86de-7f647c708502","Type":"ContainerDied","Data":"54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383105 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9h2hz" 
event={"ID":"999b3a9f-9559-4baa-9f36-4f91631fb1fc","Type":"ContainerDied","Data":"da6b134ceab1bb885c711edb9da894b922f48bb12a8888eb6880b76e528a90c9"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383115 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement16c0-account-delete-dzcpq" event={"ID":"7aa25f12-05b2-4632-921a-d126059a63be","Type":"ContainerDied","Data":"5f17bd753149822470d3379e7b6816046320eaa0d3cff344e216fd08311e7f48"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383126 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f17bd753149822470d3379e7b6816046320eaa0d3cff344e216fd08311e7f48" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383135 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerDied","Data":"9baba77c235fa2da53a1933646bcf77963daf0d1bceccb6370d40760af07bd34"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383146 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerDied","Data":"489b35ea36a581d0ca50df223a61ef6883583aa6d15472a109b5513aa80e1f47"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383158 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6l7hn" event={"ID":"241b1f65-5edb-4965-b9af-e8e12b73124c","Type":"ContainerDied","Data":"dae98b524e6eefa094b20d50b4a5d2c970bd592f62b81450d3ec13fb4760fc38"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.383175 4822 scope.go:117] "RemoveContainer" containerID="37b4d43e434fbc8b371e9636d76d9af8856d96fcc8463cdd3a2a306557a6b929" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.395259 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_108483bc-0a52-4ac2-8086-fa89466ea3aa/ovsdbserver-nb/0.log" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.395379 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"108483bc-0a52-4ac2-8086-fa89466ea3aa","Type":"ContainerDied","Data":"606e62ba4b6b8392d156769f63c12dbf600698294d4209f882b2b5af761dde6c"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.395485 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.408692 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "241b1f65-5edb-4965-b9af-e8e12b73124c" (UID: "241b1f65-5edb-4965-b9af-e8e12b73124c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.409043 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.409970 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "108483bc-0a52-4ac2-8086-fa89466ea3aa" (UID: "108483bc-0a52-4ac2-8086-fa89466ea3aa"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.425058 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerID="ad3e61286e546aa785f998bc78158faaf28a9227d533f3ce1eeace9ba41ab652" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.425179 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerDied","Data":"ad3e61286e546aa785f998bc78158faaf28a9227d533f3ce1eeace9ba41ab652"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.428183 4822 generic.go:334] "Generic (PLEG): container finished" podID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerID="a1174022bfa90fdbbc7bdb6448ded60ea8ea9a3effbb5e0c5631a9b1f7bfe1e1" exitCode=143 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.428260 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerDied","Data":"a1174022bfa90fdbbc7bdb6448ded60ea8ea9a3effbb5e0c5631a9b1f7bfe1e1"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.434968 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_22236db0-c666-44e4-a290-66626e76cdad/ovsdbserver-sb/0.log" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.435031 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22236db0-c666-44e4-a290-66626e76cdad","Type":"ContainerDied","Data":"f540e77c53df470bad8017634d1f6c2f94da2895ba72b0aba59dbab08662ea7c"} Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.435141 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.440791 4822 scope.go:117] "RemoveContainer" containerID="203a642bf3c3f9b4af40226e7974ac31a360eac399b610e485b7214df187d558" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.471312 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7\") pod \"7aa25f12-05b2-4632-921a-d126059a63be\" (UID: \"7aa25f12-05b2-4632-921a-d126059a63be\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.471847 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22236db0-c666-44e4-a290-66626e76cdad-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.471873 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/241b1f65-5edb-4965-b9af-e8e12b73124c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.471885 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/108483bc-0a52-4ac2-8086-fa89466ea3aa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.475007 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.493338 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9h2hz"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.501023 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7" (OuterVolumeSpecName: "kube-api-access-dq8p7") pod "7aa25f12-05b2-4632-921a-d126059a63be" (UID: "7aa25f12-05b2-4632-921a-d126059a63be"). InnerVolumeSpecName "kube-api-access-dq8p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.571932 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.575235 4822 scope.go:117] "RemoveContainer" containerID="1efb57ad3e93a9791822615f0a82caa8998c856b0e2fc40cd1c40d4fa6e98d56" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.575977 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/7aa25f12-05b2-4632-921a-d126059a63be-kube-api-access-dq8p7\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.588753 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.734174 4822 scope.go:117] "RemoveContainer" containerID="80e86c1e4ef3bdd1b551d89f9e5f8eac8c6d612c80b88f1c7b4a89756edf38ef" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.765254 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.812050 4822 scope.go:117] "RemoveContainer" containerID="02cf6c43ff65c5cfd73aa24c2571a632c72706b61bec1beff2c761fecf4a63ac" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.836980 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.844435 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:06 crc kubenswrapper[4822]: W1010 06:47:06.857959 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc72727a_70e5_402e_90f1_2c54c48dd5f8.slice/crio-d728381013137c6b810bd10f751a79667e9e1b3ec2cdbaf8b5ee408b4ec28624 WatchSource:0}: Error finding container d728381013137c6b810bd10f751a79667e9e1b3ec2cdbaf8b5ee408b4ec28624: Status 404 returned error can't find the container with id d728381013137c6b810bd10f751a79667e9e1b3ec2cdbaf8b5ee408b4ec28624 Oct 10 06:47:06 crc kubenswrapper[4822]: W1010 06:47:06.865998 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07072cb_ae19_4dcb_9f52_432fe923949d.slice/crio-53dbd27625668b4d44abfc598493fb4a5c6ececdfbfff7c832320fb49ece7bf8 WatchSource:0}: Error finding container 53dbd27625668b4d44abfc598493fb4a5c6ececdfbfff7c832320fb49ece7bf8: Status 404 returned error can't find the container with id 53dbd27625668b4d44abfc598493fb4a5c6ececdfbfff7c832320fb49ece7bf8 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.869232 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.871234 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell07142-account-delete-jddpj"] Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892115 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glb7l\" (UniqueName: \"kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l\") pod \"419c8ee7-56fd-43cc-86de-7f647c708502\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892171 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892214 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892260 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892338 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs\") pod \"419c8ee7-56fd-43cc-86de-7f647c708502\" (UID: 
\"419c8ee7-56fd-43cc-86de-7f647c708502\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892356 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data\") pod \"419c8ee7-56fd-43cc-86de-7f647c708502\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892379 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892408 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xffb6\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892463 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle\") pod \"419c8ee7-56fd-43cc-86de-7f647c708502\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892484 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892536 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs\") pod \"419c8ee7-56fd-43cc-86de-7f647c708502\" (UID: \"419c8ee7-56fd-43cc-86de-7f647c708502\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892572 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.892603 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs\") pod \"0df354f0-e9e8-441a-a676-8a6468b8c191\" (UID: \"0df354f0-e9e8-441a-a676-8a6468b8c191\") " Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.898312 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.901466 4822 scope.go:117] "RemoveContainer" containerID="365d4ada94cc75d05b4668ee6f928bae56736987f1c58600aa5915af7eebd8f7" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.906563 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.907220 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.907543 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.912201 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"] Oct 10 06:47:06 crc kubenswrapper[4822]: W1010 06:47:06.912686 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52264dc7_4118_484f_ab12_1bfd17172c20.slice/crio-50d159aac72227ee1187b2a0cd8cff14aa3adefd530efa833248f31eee9c9e76 WatchSource:0}: Error finding container 50d159aac72227ee1187b2a0cd8cff14aa3adefd530efa833248f31eee9c9e76: Status 404 returned error can't find the container with id 50d159aac72227ee1187b2a0cd8cff14aa3adefd530efa833248f31eee9c9e76 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.912912 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.924349 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l" (OuterVolumeSpecName: "kube-api-access-glb7l") pod "419c8ee7-56fd-43cc-86de-7f647c708502" (UID: "419c8ee7-56fd-43cc-86de-7f647c708502"). InnerVolumeSpecName "kube-api-access-glb7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.925535 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6" (OuterVolumeSpecName: "kube-api-access-xffb6") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "kube-api-access-xffb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.935267 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"] Oct 10 06:47:06 crc kubenswrapper[4822]: W1010 06:47:06.956811 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c2b7a8e_ab63_4d56_929e_1e6898294956.slice/crio-56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523 WatchSource:0}: Error finding container 56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523: Status 404 returned error can't find the container with id 56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523 Oct 10 06:47:06 crc kubenswrapper[4822]: I1010 06:47:06.984936 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.006993 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vksr\" (UniqueName: \"kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.007073 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.007108 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 
06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.007184 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.007210 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.007340 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle\") pod \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\" (UID: \"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.010565 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xffb6\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-kube-api-access-xffb6\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.010598 4822 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0df354f0-e9e8-441a-a676-8a6468b8c191-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.010624 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0df354f0-e9e8-441a-a676-8a6468b8c191-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.010635 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glb7l\" 
(UniqueName: \"kubernetes.io/projected/419c8ee7-56fd-43cc-86de-7f647c708502-kube-api-access-glb7l\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.013208 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.015389 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.015941 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data" (OuterVolumeSpecName: "config-data") pod "419c8ee7-56fd-43cc-86de-7f647c708502" (UID: "419c8ee7-56fd-43cc-86de-7f647c708502"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.021075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts" (OuterVolumeSpecName: "scripts") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.022223 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr" (OuterVolumeSpecName: "kube-api-access-8vksr") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "kube-api-access-8vksr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.040495 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6l7hn"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.042196 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.043911 4822 scope.go:117] "RemoveContainer" containerID="c3480d312ca94790932ca6354883bc03b463bd75e6035ae95f7c287a9d29849c" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.058873 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "419c8ee7-56fd-43cc-86de-7f647c708502" (UID: "419c8ee7-56fd-43cc-86de-7f647c708502"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.070214 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.078441 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data" (OuterVolumeSpecName: "config-data") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.078904 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.081100 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.081591 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "419c8ee7-56fd-43cc-86de-7f647c708502" (UID: "419c8ee7-56fd-43cc-86de-7f647c708502"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.090211 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112355 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vksr\" (UniqueName: \"kubernetes.io/projected/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-kube-api-access-8vksr\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112394 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112409 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112424 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112435 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112448 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112460 4822 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: 
I1010 06:47:07.112471 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112483 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.112494 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.165157 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "419c8ee7-56fd-43cc-86de-7f647c708502" (UID: "419c8ee7-56fd-43cc-86de-7f647c708502"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.191292 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.214749 4822 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/419c8ee7-56fd-43cc-86de-7f647c708502-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.214793 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.226970 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.227257 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-central-agent" containerID="cri-o://4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f" gracePeriod=30 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.227648 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="proxy-httpd" containerID="cri-o://8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9" gracePeriod=30 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.227699 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="sg-core" containerID="cri-o://80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46" gracePeriod=30 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.227729 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" 
containerName="ceilometer-notification-agent" containerID="cri-o://449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13" gracePeriod=30 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.234937 4822 scope.go:117] "RemoveContainer" containerID="85d5df3b347c7c673d52d3c41f938ec4fc9dd982fd6b9414a88fc92d459a23ab" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.238100 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.238276 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" containerName="kube-state-metrics" containerID="cri-o://c7baa3dcfc861f3f0c1cd8711d2244d5d331b16f63e69a27aa93f37eaaee5a7d" gracePeriod=30 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.282253 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0df354f0-e9e8-441a-a676-8a6468b8c191" (UID: "0df354f0-e9e8-441a-a676-8a6468b8c191"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.291092 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:40740->10.217.0.166:8776: read: connection reset by peer" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.323282 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0df354f0-e9e8-441a-a676-8a6468b8c191-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.425333 4822 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.425420 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. No retries permitted until 2025-10-10 06:47:11.425401927 +0000 UTC m=+1378.520560123 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-scripts" not found Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.425790 4822 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.425854 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data podName:14ce9853-109f-456d-b51c-b1d11072a90d nodeName:}" failed. 
No retries permitted until 2025-10-10 06:47:11.425842769 +0000 UTC m=+1378.521000965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data") pod "glance-default-external-api-0" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d") : secret "glance-default-external-config-data" not found Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.460433 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9441-account-delete-zxr9g" event={"ID":"c07072cb-ae19-4dcb-9f52-432fe923949d","Type":"ContainerStarted","Data":"53dbd27625668b4d44abfc598493fb4a5c6ececdfbfff7c832320fb49ece7bf8"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.463086 4822 generic.go:334] "Generic (PLEG): container finished" podID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerID="81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" exitCode=0 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.463177 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ddf4e23c-3df4-4d67-8a61-c97f860aa797","Type":"ContainerDied","Data":"81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.463207 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ddf4e23c-3df4-4d67-8a61-c97f860aa797","Type":"ContainerDied","Data":"f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.463222 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f56fc2714e27d1f9448430930d11b3f9255f9e797b575f549634fee4c9b67e28" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.470488 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder5f70-account-delete-kxtp9" 
event={"ID":"dc72727a-70e5-402e-90f1-2c54c48dd5f8","Type":"ContainerStarted","Data":"d728381013137c6b810bd10f751a79667e9e1b3ec2cdbaf8b5ee408b4ec28624"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.482937 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data" (OuterVolumeSpecName: "config-data") pod "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" (UID: "7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.485288 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d3cbb68-b8c5-44fa-bd93-f70ad796d01a","Type":"ContainerDied","Data":"9a144ed9a820acef1b717593380d295971f0f0cec69bbca5b011fa52fe058d12"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.485350 4822 scope.go:117] "RemoveContainer" containerID="6bab53c5a6f89332c0aa8004dfd4394eade34796f5bf17f77ba2d71af6be542f" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.485507 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.493862 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07142-account-delete-jddpj" event={"ID":"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86","Type":"ContainerStarted","Data":"b9116d6ea01101ec61e3dfe7b9540f5bc7b5c158af7c6f45265e7380b9e6c2ec"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.518680 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b5485c95f-w8q56" event={"ID":"0df354f0-e9e8-441a-a676-8a6468b8c191","Type":"ContainerDied","Data":"3eff87da3d7270cf39f79adfd026a933f6d7bb955dfb31a8855aafe4be4f1832"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.518784 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.546692 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.565609 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron834d-account-delete-j5ft4" event={"ID":"52264dc7-4118-484f-ab12-1bfd17172c20","Type":"ContainerStarted","Data":"50d159aac72227ee1187b2a0cd8cff14aa3adefd530efa833248f31eee9c9e76"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.601662 4822 generic.go:334] "Generic (PLEG): container finished" podID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" containerID="c7baa3dcfc861f3f0c1cd8711d2244d5d331b16f63e69a27aa93f37eaaee5a7d" exitCode=2 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.602055 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"afb8479c-8058-4a41-9bc7-8fd09bd321d8","Type":"ContainerDied","Data":"c7baa3dcfc861f3f0c1cd8711d2244d5d331b16f63e69a27aa93f37eaaee5a7d"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.613703 4822 generic.go:334] "Generic (PLEG): container finished" podID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerID="a1f35f0eceeefdb49f84dae81b99596158510c253b94b22659bccc55d2420d00" exitCode=0 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.613880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerDied","Data":"a1f35f0eceeefdb49f84dae81b99596158510c253b94b22659bccc55d2420d00"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.640528 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibdb6-account-delete-qgrcb" 
event={"ID":"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9","Type":"ContainerStarted","Data":"15926c83561cae90325d5f80ad945a0b63442ffd283e3fc2b6c9d685de9d3c6a"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.657995 4822 generic.go:334] "Generic (PLEG): container finished" podID="f0da7840-eaa9-46a7-bda6-5de928993572" containerID="5768417b026ddc15a2a8c2d6da91a0f1f8ec8b7c89708cc85cf21fe86db42db9" exitCode=0 Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.660277 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.673307 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.673462 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" path="/var/lib/kubelet/pods/108483bc-0a52-4ac2-8086-fa89466ea3aa/volumes" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.674164 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22236db0-c666-44e4-a290-66626e76cdad" path="/var/lib/kubelet/pods/22236db0-c666-44e4-a290-66626e76cdad/volumes" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.676440 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241b1f65-5edb-4965-b9af-e8e12b73124c" path="/var/lib/kubelet/pods/241b1f65-5edb-4965-b9af-e8e12b73124c/volumes" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 
06:47:07.677145 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" path="/var/lib/kubelet/pods/999b3a9f-9559-4baa-9f36-4f91631fb1fc/volumes" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.678855 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cd3778-5a5a-483a-af22-5b8420ae896b" path="/var/lib/kubelet/pods/c8cd3778-5a5a-483a-af22-5b8420ae896b/volumes" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.680279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerDied","Data":"5768417b026ddc15a2a8c2d6da91a0f1f8ec8b7c89708cc85cf21fe86db42db9"} Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.689533 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.689596 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.694447 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell11f81-account-delete-mz2rx" event={"ID":"6c2b7a8e-ab63-4d56-929e-1e6898294956","Type":"ContainerStarted","Data":"56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523"} Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.694580 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0 is running failed: container process not found" containerID="81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.700646 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0 is running failed: container process not found" containerID="81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.719578 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0 is running failed: container process not found" containerID="81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.719646 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerName="nova-scheduler-scheduler" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.750396 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"419c8ee7-56fd-43cc-86de-7f647c708502","Type":"ContainerDied","Data":"236d517d91ce9915087f2a579c916e82cb4c9fe893ae6001daac69e5108647cf"} Oct 10 06:47:07 crc 
kubenswrapper[4822]: I1010 06:47:07.751265 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.764514 4822 generic.go:334] "Generic (PLEG): container finished" podID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerID="80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46" exitCode=2 Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.764585 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerDied","Data":"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"} Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.764619 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement16c0-account-delete-dzcpq" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.876190 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.896216 4822 scope.go:117] "RemoveContainer" containerID="efce444dc287af620565305af2a58baca7c83295b3bd9c0cfb54af7d64975eef" Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.924000 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.929140 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.937066 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.938392 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" 
containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.938434 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.945405 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" probeResult="failure" output=< Oct 10 06:47:07 crc kubenswrapper[4822]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 10 06:47:07 crc kubenswrapper[4822]: > Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.945480 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.954969 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data\") pod \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.955148 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle\") pod \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.955218 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkbjs\" (UniqueName: \"kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs\") pod \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\" (UID: \"ddf4e23c-3df4-4d67-8a61-c97f860aa797\") " Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.962858 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:07 crc kubenswrapper[4822]: E1010 06:47:07.962918 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:07 crc kubenswrapper[4822]: I1010 06:47:07.968951 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs" (OuterVolumeSpecName: "kube-api-access-zkbjs") pod "ddf4e23c-3df4-4d67-8a61-c97f860aa797" (UID: "ddf4e23c-3df4-4d67-8a61-c97f860aa797"). InnerVolumeSpecName "kube-api-access-zkbjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.013561 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddf4e23c-3df4-4d67-8a61-c97f860aa797" (UID: "ddf4e23c-3df4-4d67-8a61-c97f860aa797"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.018409 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057409 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057461 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057557 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057622 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data\") pod 
\"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057646 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qqx\" (UniqueName: \"kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057712 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.057771 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs\") pod \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\" (UID: \"3420c1f4-bf0d-4de6-90a4-c00e0722d911\") " Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.058430 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.058457 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkbjs\" (UniqueName: \"kubernetes.io/projected/ddf4e23c-3df4-4d67-8a61-c97f860aa797-kube-api-access-zkbjs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.067966 4822 scope.go:117] "RemoveContainer" containerID="d3951919e40efdfd0f35f5fc13324aff22e3d70520d47e9d226db9a844dd3d52" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.068478 4822 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs" (OuterVolumeSpecName: "logs") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.105703 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.105944 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" containerName="memcached" containerID="cri-o://a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8" gracePeriod=30 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.106385 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx" (OuterVolumeSpecName: "kube-api-access-57qqx") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "kube-api-access-57qqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.143931 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts" (OuterVolumeSpecName: "scripts") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.178903 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mrdhs"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.180823 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.180855 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3420c1f4-bf0d-4de6-90a4-c00e0722d911-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.180865 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qqx\" (UniqueName: \"kubernetes.io/projected/3420c1f4-bf0d-4de6-90a4-c00e0722d911-kube-api-access-57qqx\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.183153 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data" (OuterVolumeSpecName: "config-data") pod "ddf4e23c-3df4-4d67-8a61-c97f860aa797" (UID: "ddf4e23c-3df4-4d67-8a61-c97f860aa797"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.197278 4822 scope.go:117] "RemoveContainer" containerID="065d149d1b211170467b39093535c955aae1f107db060308f9576b5afffb25a6" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.222490 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.264958 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mkfd6"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.279988 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 06:47:08 crc kubenswrapper[4822]: E1010 06:47:08.282708 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.288503 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf4e23c-3df4-4d67-8a61-c97f860aa797-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.291515 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": dial tcp 10.217.0.204:8775: connect: connection refused" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.291599 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.204:8775/\": dial tcp 10.217.0.204:8775: connect: connection refused" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.299753 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mrdhs"] Oct 10 06:47:08 crc kubenswrapper[4822]: E1010 06:47:08.309162 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.318579 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mkfd6"] Oct 10 06:47:08 crc kubenswrapper[4822]: E1010 06:47:08.319881 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:08 crc kubenswrapper[4822]: E1010 06:47:08.320024 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerName="nova-cell0-conductor-conductor" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.347404 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.347689 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5b98d46cf9-66pcm" podUID="dc3ce4fd-4bba-4242-91ba-076cf3729770" containerName="keystone-api" 
containerID="cri-o://aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6" gracePeriod=30 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.363261 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.370964 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement16c0-account-delete-dzcpq"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.379924 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.396880 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qn8pd"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.407846 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qn8pd"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.408040 4822 scope.go:117] "RemoveContainer" containerID="54d693b162a1803c2583c5834b198cb4544f2c4815b65ef898c0cea5e7667146" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.439090 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7142-account-create-bmfp7"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.445548 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell07142-account-delete-jddpj"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.451216 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7142-account-create-bmfp7"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.456850 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9hsvs"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.464539 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a920-account-create-kxwvs"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 
06:47:08.469892 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9hsvs"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.474514 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a920-account-create-kxwvs"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.481860 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rxtp5"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.484530 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rxtp5"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.490163 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bdb6-account-create-9kt24"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.508651 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bdb6-account-create-9kt24"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.511525 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"] Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.719655 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data" (OuterVolumeSpecName: "config-data") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.803352 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron834d-account-delete-j5ft4" event={"ID":"52264dc7-4118-484f-ab12-1bfd17172c20","Type":"ContainerStarted","Data":"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.803497 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron834d-account-delete-j5ft4" podUID="52264dc7-4118-484f-ab12-1bfd17172c20" containerName="mariadb-account-delete" containerID="cri-o://877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37" gracePeriod=30 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.813620 4822 generic.go:334] "Generic (PLEG): container finished" podID="35854fe5-2e29-4a49-9783-873bee1058e2" containerID="d17675142c2329604a137158115089b304e6ce04d8ec7938439c69f02db6cd14" exitCode=0 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.813678 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerDied","Data":"d17675142c2329604a137158115089b304e6ce04d8ec7938439c69f02db6cd14"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.848440 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.849109 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron834d-account-delete-j5ft4" podStartSLOduration=6.849085799 podStartE2EDuration="6.849085799s" podCreationTimestamp="2025-10-10 06:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 06:47:08.828353762 +0000 UTC m=+1375.923511968" watchObservedRunningTime="2025-10-10 06:47:08.849085799 +0000 UTC m=+1375.944243995" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.900159 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc97c446d-qd577" event={"ID":"3420c1f4-bf0d-4de6-90a4-c00e0722d911","Type":"ContainerDied","Data":"a86813d124296ecd04e3e363249f2055025a09990fd535e0d96795cb4849ed4b"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.900206 4822 scope.go:117] "RemoveContainer" containerID="a1f35f0eceeefdb49f84dae81b99596158510c253b94b22659bccc55d2420d00" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.900349 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc97c446d-qd577" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.910244 4822 generic.go:334] "Generic (PLEG): container finished" podID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerID="fda58ea5f9b88b81e2eae62a1675670199fddb3e9d024a70d2a8d75abe7fbe9f" exitCode=0 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.910325 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerDied","Data":"fda58ea5f9b88b81e2eae62a1675670199fddb3e9d024a70d2a8d75abe7fbe9f"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.913484 4822 generic.go:334] "Generic (PLEG): container finished" podID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerID="cea6be947bf52e2474312230db60ef4f19b18c57064bc9913aa1efa9a6406d53" exitCode=0 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.913551 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerDied","Data":"cea6be947bf52e2474312230db60ef4f19b18c57064bc9913aa1efa9a6406d53"} Oct 10 06:47:08 crc 
kubenswrapper[4822]: I1010 06:47:08.913579 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e5c43dc5-0a44-497d-8d7c-3a818ddf1735","Type":"ContainerDied","Data":"7abbad523e93f4274c96aa54fd62cf3a2176d5ba999ead6cbca3d1ac1331ffe8"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.913594 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7abbad523e93f4274c96aa54fd62cf3a2176d5ba999ead6cbca3d1ac1331ffe8" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.918924 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerID="080914d126025b6e485b39c65b1fd29e1335110a9ec2a7216a930a1f2cc0164e" exitCode=0 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.919007 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerDied","Data":"080914d126025b6e485b39c65b1fd29e1335110a9ec2a7216a930a1f2cc0164e"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.920334 4822 generic.go:334] "Generic (PLEG): container finished" podID="ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" containerID="2670d1035b9acdfd45f5637b300e37d6a4c39796a1a60e3494ae45d0369ae659" exitCode=0 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.920385 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibdb6-account-delete-qgrcb" event={"ID":"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9","Type":"ContainerDied","Data":"2670d1035b9acdfd45f5637b300e37d6a4c39796a1a60e3494ae45d0369ae659"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.925483 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.961128 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="galera" containerID="cri-o://b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" gracePeriod=30 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.961336 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder5f70-account-delete-kxtp9" event={"ID":"dc72727a-70e5-402e-90f1-2c54c48dd5f8","Type":"ContainerStarted","Data":"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.961435 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder5f70-account-delete-kxtp9" podUID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" containerName="mariadb-account-delete" containerID="cri-o://cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa" gracePeriod=30 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.966559 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.983480 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder5f70-account-delete-kxtp9" podStartSLOduration=6.983463809 podStartE2EDuration="6.983463809s" podCreationTimestamp="2025-10-10 06:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:47:08.980539835 +0000 UTC m=+1376.075698031" watchObservedRunningTime="2025-10-10 06:47:08.983463809 +0000 UTC m=+1376.078622005" Oct 10 06:47:08 crc 
kubenswrapper[4822]: I1010 06:47:08.983917 4822 generic.go:334] "Generic (PLEG): container finished" podID="6c2b7a8e-ab63-4d56-929e-1e6898294956" containerID="f10e534bb95fa2ebc7af37b8031bdbf1a65fb1ad6946670369531ead85fae14a" exitCode=1 Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.984043 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell11f81-account-delete-mz2rx" event={"ID":"6c2b7a8e-ab63-4d56-929e-1e6898294956","Type":"ContainerDied","Data":"56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.984113 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e75bce81c48a887bdeb4fc79b5839c01bb4302d31801293ddf013cbb05f523" Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.984175 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell11f81-account-delete-mz2rx" event={"ID":"6c2b7a8e-ab63-4d56-929e-1e6898294956","Type":"ContainerDied","Data":"f10e534bb95fa2ebc7af37b8031bdbf1a65fb1ad6946670369531ead85fae14a"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.986252 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07142-account-delete-jddpj" event={"ID":"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86","Type":"ContainerStarted","Data":"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b"} Oct 10 06:47:08 crc kubenswrapper[4822]: I1010 06:47:08.986417 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell07142-account-delete-jddpj" podUID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" containerName="mariadb-account-delete" containerID="cri-o://5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b" gracePeriod=30 Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.001946 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell07142-account-delete-jddpj" 
podStartSLOduration=6.001930311 podStartE2EDuration="6.001930311s" podCreationTimestamp="2025-10-10 06:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:47:09.001643593 +0000 UTC m=+1376.096801789" watchObservedRunningTime="2025-10-10 06:47:09.001930311 +0000 UTC m=+1376.097088517" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.002287 4822 generic.go:334] "Generic (PLEG): container finished" podID="d5180126-ac55-464c-90dd-565daffba54c" containerID="80843954eb6d7ec548593d47ef32a3ec88c616382899f1b6da51f373178452d9" exitCode=0 Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.002418 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerDied","Data":"80843954eb6d7ec548593d47ef32a3ec88c616382899f1b6da51f373178452d9"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.019578 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0da7840-eaa9-46a7-bda6-5de928993572","Type":"ContainerDied","Data":"680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.019685 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680c541b9a648baf6e66f4880e6462444f29a88894a1aeefcc10d2e4d497240b" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.019994 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.037103 4822 generic.go:334] "Generic (PLEG): container finished" podID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerID="8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9" exitCode=0 Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.037136 4822 generic.go:334] "Generic (PLEG): container finished" podID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerID="4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f" exitCode=0 Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.037201 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerDied","Data":"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.037244 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerDied","Data":"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.040528 4822 generic.go:334] "Generic (PLEG): container finished" podID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerID="c06e8d4a5f40e2de362c673f9a2482416e108f4360a178ff0fccd0dcaf4c36e9" exitCode=0 Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.040579 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerDied","Data":"c06e8d4a5f40e2de362c673f9a2482416e108f4360a178ff0fccd0dcaf4c36e9"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.043521 4822 generic.go:334] "Generic (PLEG): container finished" podID="14ce9853-109f-456d-b51c-b1d11072a90d" containerID="0f9a24b9406d92324b02f103bd4f78da68951fba025fb39c169f20c3b804f0f8" exitCode=0 Oct 10 06:47:09 crc 
kubenswrapper[4822]: I1010 06:47:09.043562 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerDied","Data":"0f9a24b9406d92324b02f103bd4f78da68951fba025fb39c169f20c3b804f0f8"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.044026 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3420c1f4-bf0d-4de6-90a4-c00e0722d911" (UID: "3420c1f4-bf0d-4de6-90a4-c00e0722d911"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.058221 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.058205 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"afb8479c-8058-4a41-9bc7-8fd09bd321d8","Type":"ContainerDied","Data":"6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a"} Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.058500 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ab5b78cb60f7f37682f2e378fd2956db10b4657a8c73cd19534becf33df569a" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.068011 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.068050 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3420c1f4-bf0d-4de6-90a4-c00e0722d911-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 
crc kubenswrapper[4822]: I1010 06:47:09.087092 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.269767 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.295366 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.312945 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.337295 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.342002 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.356945 4822 scope.go:117] "RemoveContainer" containerID="f37aae3b6d57636153fe7116fe10389d8dc0009af7d1ea6600168de93249a807" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.370566 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell11f81-account-delete-mz2rx" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.381950 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382022 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382075 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382117 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382138 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382160 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382177 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382222 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382237 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.382299 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44f6n\" (UniqueName: \"kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n\") pod \"f0da7840-eaa9-46a7-bda6-5de928993572\" (UID: \"f0da7840-eaa9-46a7-bda6-5de928993572\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.383885 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.383920 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs" (OuterVolumeSpecName: "logs") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.388124 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.389497 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n" (OuterVolumeSpecName: "kube-api-access-44f6n") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "kube-api-access-44f6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.404107 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts" (OuterVolumeSpecName: "scripts") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.417279 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc97c446d-qd577"] Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.419931 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.456456 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.470176 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6fc97c446d-qd577"] Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.474145 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.475823 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484425 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqcr\" (UniqueName: \"kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr\") pod \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484483 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484523 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79kjl\" (UniqueName: \"kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484580 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config\") pod \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484619 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484652 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs\") pod \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484675 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484711 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484818 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484847 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484873 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: 
\"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484903 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle\") pod \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\" (UID: \"afb8479c-8058-4a41-9bc7-8fd09bd321d8\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484932 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5drg2\" (UniqueName: \"kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484949 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.484984 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485009 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485047 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485083 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs\") pod \"c0da5e90-c960-4d67-9c19-6854f61dee14\" (UID: \"c0da5e90-c960-4d67-9c19-6854f61dee14\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485142 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srsc\" (UniqueName: \"kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc\") pod \"6c2b7a8e-ab63-4d56-929e-1e6898294956\" (UID: \"6c2b7a8e-ab63-4d56-929e-1e6898294956\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485163 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts\") pod \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\" (UID: \"e5c43dc5-0a44-497d-8d7c-3a818ddf1735\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485550 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485562 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0da7840-eaa9-46a7-bda6-5de928993572-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485570 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f0da7840-eaa9-46a7-bda6-5de928993572-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485580 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44f6n\" (UniqueName: \"kubernetes.io/projected/f0da7840-eaa9-46a7-bda6-5de928993572-kube-api-access-44f6n\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.485590 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.499421 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.508076 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs" (OuterVolumeSpecName: "logs") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.517954 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets" (OuterVolumeSpecName: "secrets") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.518468 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.519001 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.519332 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.523920 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2" (OuterVolumeSpecName: "kube-api-access-5drg2") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "kube-api-access-5drg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.532626 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b5444654f-5wp86" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.551748 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr" (OuterVolumeSpecName: "kube-api-access-qcqcr") pod "afb8479c-8058-4a41-9bc7-8fd09bd321d8" (UID: "afb8479c-8058-4a41-9bc7-8fd09bd321d8"). InnerVolumeSpecName "kube-api-access-qcqcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.562005 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl" (OuterVolumeSpecName: "kube-api-access-79kjl") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "kube-api-access-79kjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.574358 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.578008 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc" (OuterVolumeSpecName: "kube-api-access-4srsc") pod "6c2b7a8e-ab63-4d56-929e-1e6898294956" (UID: "6c2b7a8e-ab63-4d56-929e-1e6898294956"). InnerVolumeSpecName "kube-api-access-4srsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586683 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586732 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586794 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle\") pod \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586846 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586872 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586919 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586942 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4k2g\" (UniqueName: \"kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.586989 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587014 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587069 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587092 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data\") pod \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587117 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587243 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs\") pod \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587282 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs\") pod \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587305 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrv64\" (UniqueName: \"kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587336 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587368 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs\") pod 
\"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587391 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data\") pod \"35854fe5-2e29-4a49-9783-873bee1058e2\" (UID: \"35854fe5-2e29-4a49-9783-873bee1058e2\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587430 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle\") pod \"d5180126-ac55-464c-90dd-565daffba54c\" (UID: \"d5180126-ac55-464c-90dd-565daffba54c\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587468 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqrrk\" (UniqueName: \"kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk\") pod \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\" (UID: \"b5d11cec-09a9-4adc-9889-cc90f8b983e1\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587971 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.587995 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srsc\" (UniqueName: \"kubernetes.io/projected/6c2b7a8e-ab63-4d56-929e-1e6898294956-kube-api-access-4srsc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588009 4822 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 
06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588020 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqcr\" (UniqueName: \"kubernetes.io/projected/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-api-access-qcqcr\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588032 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79kjl\" (UniqueName: \"kubernetes.io/projected/c0da5e90-c960-4d67-9c19-6854f61dee14-kube-api-access-79kjl\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588040 4822 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588048 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588056 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588064 4822 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588072 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5drg2\" (UniqueName: \"kubernetes.io/projected/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-kube-api-access-5drg2\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.588083 4822 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0da5e90-c960-4d67-9c19-6854f61dee14-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.591106 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk" (OuterVolumeSpecName: "kube-api-access-kqrrk") pod "b5d11cec-09a9-4adc-9889-cc90f8b983e1" (UID: "b5d11cec-09a9-4adc-9889-cc90f8b983e1"). InnerVolumeSpecName "kube-api-access-kqrrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.591634 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.594984 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts" (OuterVolumeSpecName: "scripts") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.595844 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs" (OuterVolumeSpecName: "logs") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.602886 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs" (OuterVolumeSpecName: "logs") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.606088 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.609301 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.613260 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g" (OuterVolumeSpecName: "kube-api-access-v4k2g") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "kube-api-access-v4k2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.619761 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.621099 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs" (OuterVolumeSpecName: "logs") pod "b5d11cec-09a9-4adc-9889-cc90f8b983e1" (UID: "b5d11cec-09a9-4adc-9889-cc90f8b983e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.623125 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64" (OuterVolumeSpecName: "kube-api-access-zrv64") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "kube-api-access-zrv64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.666636 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6" path="/var/lib/kubelet/pods/0ea8fdf5-e09c-4fb6-bc16-2d53bb95f0d6/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.667758 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175b9473-ce90-4e92-8322-f64cccbcb54b" path="/var/lib/kubelet/pods/175b9473-ce90-4e92-8322-f64cccbcb54b/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.668408 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" path="/var/lib/kubelet/pods/3420c1f4-bf0d-4de6-90a4-c00e0722d911/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.668991 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b650a5-7a88-47df-a02f-87dc3eee6f89" path="/var/lib/kubelet/pods/36b650a5-7a88-47df-a02f-87dc3eee6f89/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.670361 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" path="/var/lib/kubelet/pods/419c8ee7-56fd-43cc-86de-7f647c708502/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.671487 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb97152-e8c1-4fd0-befc-08ea47f79cdd" path="/var/lib/kubelet/pods/4fb97152-e8c1-4fd0-befc-08ea47f79cdd/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.671942 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb1f8a4-2601-4901-a55c-95fb18a9613c" path="/var/lib/kubelet/pods/6eb1f8a4-2601-4901-a55c-95fb18a9613c/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.672809 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa25f12-05b2-4632-921a-d126059a63be" 
path="/var/lib/kubelet/pods/7aa25f12-05b2-4632-921a-d126059a63be/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.673271 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c095c1b-93a3-4c9f-bea7-1d7e6310d06a" path="/var/lib/kubelet/pods/8c095c1b-93a3-4c9f-bea7-1d7e6310d06a/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.673713 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6c12c1-b88d-4488-9660-499070fbea2c" path="/var/lib/kubelet/pods/8e6c12c1-b88d-4488-9660-499070fbea2c/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.674150 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd98a0c-cdbf-437e-b488-c7c1f5c81326" path="/var/lib/kubelet/pods/9bd98a0c-cdbf-437e-b488-c7c1f5c81326/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.675235 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" path="/var/lib/kubelet/pods/ddf4e23c-3df4-4d67-8a61-c97f860aa797/volumes" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.689347 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs\") pod \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.689405 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data\") pod \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.689621 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4mfn\" (UniqueName: 
\"kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn\") pod \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.689750 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom\") pod \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.689782 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle\") pod \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\" (UID: \"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69\") " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691106 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d11cec-09a9-4adc-9889-cc90f8b983e1-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691140 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrv64\" (UniqueName: \"kubernetes.io/projected/35854fe5-2e29-4a49-9783-873bee1058e2-kube-api-access-zrv64\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691166 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691180 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqrrk\" (UniqueName: \"kubernetes.io/projected/b5d11cec-09a9-4adc-9889-cc90f8b983e1-kube-api-access-kqrrk\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc 
kubenswrapper[4822]: I1010 06:47:09.691194 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691205 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691216 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35854fe5-2e29-4a49-9783-873bee1058e2-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691227 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4k2g\" (UniqueName: \"kubernetes.io/projected/d5180126-ac55-464c-90dd-565daffba54c-kube-api-access-v4k2g\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691243 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691254 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5180126-ac55-464c-90dd-565daffba54c-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.691266 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.694584 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs" (OuterVolumeSpecName: "logs") pod "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" (UID: "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.709285 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn" (OuterVolumeSpecName: "kube-api-access-d4mfn") pod "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" (UID: "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69"). InnerVolumeSpecName "kube-api-access-d4mfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.730263 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" (UID: "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.771095 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.792902 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4mfn\" (UniqueName: \"kubernetes.io/projected/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-kube-api-access-d4mfn\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.792938 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.792951 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.792964 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.877924 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data" (OuterVolumeSpecName: "config-data") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.882407 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.882406 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.902074 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.902098 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.902144 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.931767 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.933431 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.940487 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afb8479c-8058-4a41-9bc7-8fd09bd321d8" (UID: "afb8479c-8058-4a41-9bc7-8fd09bd321d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.970925 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "afb8479c-8058-4a41-9bc7-8fd09bd321d8" (UID: "afb8479c-8058-4a41-9bc7-8fd09bd321d8"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.979569 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:09 crc kubenswrapper[4822]: I1010 06:47:09.979823 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5d11cec-09a9-4adc-9889-cc90f8b983e1" (UID: "b5d11cec-09a9-4adc-9889-cc90f8b983e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.003983 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.004021 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.004036 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.004047 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.004056 4822 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.004067 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.052983 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data" (OuterVolumeSpecName: "config-data") pod "b5d11cec-09a9-4adc-9889-cc90f8b983e1" (UID: "b5d11cec-09a9-4adc-9889-cc90f8b983e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.053667 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.054248 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" (UID: "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.082869 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.085963 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.093292 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.101928 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e5c43dc5-0a44-497d-8d7c-3a818ddf1735" (UID: "e5c43dc5-0a44-497d-8d7c-3a818ddf1735"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.103159 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5180126-ac55-464c-90dd-565daffba54c","Type":"ContainerDied","Data":"d554140da2cdf5519348ac91880f18fdafb0e4c11af74d8ffaba26f817dabc77"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.103627 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"14ce9853-109f-456d-b51c-b1d11072a90d","Type":"ContainerDied","Data":"bd27b46ad44fbb693d221e97cb884bb4bdeb820c1a9927b20bb8f06d919ad15d"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.103666 4822 scope.go:117] "RemoveContainer" containerID="80843954eb6d7ec548593d47ef32a3ec88c616382899f1b6da51f373178452d9"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.111690 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.111728 4822 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c43dc5-0a44-497d-8d7c-3a818ddf1735-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.111741 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.111757 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.111768 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.113445 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.119264 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.118778 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d11cec-09a9-4adc-9889-cc90f8b983e1","Type":"ContainerDied","Data":"720f81325a5020549ed46f601002f3ce8017d12ce7ba9d5ef0d2fdf3b6e23cb6"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.120935 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.121232 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibdb6-account-delete-qgrcb" event={"ID":"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9","Type":"ContainerDied","Data":"15926c83561cae90325d5f80ad945a0b63442ffd283e3fc2b6c9d685de9d3c6a"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.121259 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15926c83561cae90325d5f80ad945a0b63442ffd283e3fc2b6c9d685de9d3c6a"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.121481 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibdb6-account-delete-qgrcb"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.124301 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b5444654f-5wp86" event={"ID":"11c95228-48ad-4e25-9cf7-bf0a2a1e4c69","Type":"ContainerDied","Data":"283e16ed7fc3f8e422839e721a64fe525ce7ce214a51cdf500c1ed137b76e879"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.124366 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b5444654f-5wp86"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.124540 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data" (OuterVolumeSpecName: "config-data") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.138296 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c0da5e90-c960-4d67-9c19-6854f61dee14" (UID: "c0da5e90-c960-4d67-9c19-6854f61dee14"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.143378 4822 generic.go:334] "Generic (PLEG): container finished" podID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" exitCode=0
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.143469 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"09b24550-0f5f-46ff-bf11-192fa1f15650","Type":"ContainerDied","Data":"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.143499 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"09b24550-0f5f-46ff-bf11-192fa1f15650","Type":"ContainerDied","Data":"5281d1b387e797afe610f4ac5538b51ce7ca78e96aa44b4af4fb2e1356932856"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.143565 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.145345 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.148255 4822 scope.go:117] "RemoveContainer" containerID="ff627b4112d5afa4366e1239ee8cc64edf816c0effeb7dafc7e69ae5cc1194ec"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.148829 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" event={"ID":"35854fe5-2e29-4a49-9783-873bee1058e2","Type":"ContainerDied","Data":"d4a9bafe329a5dd29564e56dc6c12b0bb627050c8e4b71bc14df993ef884ad8a"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.148915 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.150066 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c8fcfdd4-4gk9s"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.156823 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35854fe5-2e29-4a49-9783-873bee1058e2" (UID: "35854fe5-2e29-4a49-9783-873bee1058e2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.156921 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data" (OuterVolumeSpecName: "config-data") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.159444 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9441-account-delete-zxr9g" event={"ID":"c07072cb-ae19-4dcb-9f52-432fe923949d","Type":"ContainerStarted","Data":"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.159578 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican9441-account-delete-zxr9g" podUID="c07072cb-ae19-4dcb-9f52-432fe923949d" containerName="mariadb-account-delete" containerID="cri-o://95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75" gracePeriod=30
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.165572 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.170898 4822 generic.go:334] "Generic (PLEG): container finished" podID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" containerID="a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8" exitCode=0
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.170959 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f","Type":"ContainerDied","Data":"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.170982 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f","Type":"ContainerDied","Data":"29ea42e8823ceb5ead644f658fee9bfd0f2ceee1d79139c6f2ba13e3b3147a27"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.172822 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.174072 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell11f81-account-delete-mz2rx"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.174775 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.185857 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0da5e90-c960-4d67-9c19-6854f61dee14","Type":"ContainerDied","Data":"e616bfd3426a5248dc7b7c41864dc2d03e683a80dd5bd1aa85eafddc7d8adda0"}
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.186050 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.186569 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.198555 4822 scope.go:117] "RemoveContainer" containerID="0f9a24b9406d92324b02f103bd4f78da68951fba025fb39c169f20c3b804f0f8"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212033 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5180126-ac55-464c-90dd-565daffba54c" (UID: "d5180126-ac55-464c-90dd-565daffba54c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212517 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212595 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g77z\" (UniqueName: \"kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z\") pod \"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9\" (UID: \"ba1c43c3-6e05-459e-9692-b8ddeb17e0e9\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212623 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212651 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212730 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v4px\" (UniqueName: \"kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px\") pod \"09b24550-0f5f-46ff-bf11-192fa1f15650\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212790 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212858 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data\") pod \"09b24550-0f5f-46ff-bf11-192fa1f15650\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212907 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle\") pod \"09b24550-0f5f-46ff-bf11-192fa1f15650\" (UID: \"09b24550-0f5f-46ff-bf11-192fa1f15650\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.212961 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213040 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213071 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m4r9\" (UniqueName: \"kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213118 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts\") pod \"14ce9853-109f-456d-b51c-b1d11072a90d\" (UID: \"14ce9853-109f-456d-b51c-b1d11072a90d\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213578 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213602 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213614 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213625 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213635 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5180126-ac55-464c-90dd-565daffba54c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213645 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0da5e90-c960-4d67-9c19-6854f61dee14-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.213655 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35854fe5-2e29-4a49-9783-873bee1058e2-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.213713 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.213761 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data podName:1fa59157-6b4e-4379-89e0-415e74c581a8 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:18.213744201 +0000 UTC m=+1385.308902397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data") pod "rabbitmq-cell1-server-0" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8") : configmap "rabbitmq-cell1-config-data" not found
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.223305 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs" (OuterVolumeSpecName: "logs") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.226683 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "afb8479c-8058-4a41-9bc7-8fd09bd321d8" (UID: "afb8479c-8058-4a41-9bc7-8fd09bd321d8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.228491 4822 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.228546 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data podName:48fba34a-0289-41f0-b1d7-bb71a22253a3 nodeName:}" failed. No retries permitted until 2025-10-10 06:47:18.228529407 +0000 UTC m=+1385.323687603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data") pod "rabbitmq-server-0" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3") : configmap "rabbitmq-config-data" not found
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.230516 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.236604 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican9441-account-delete-zxr9g" podStartSLOduration=8.236580189 podStartE2EDuration="8.236580189s" podCreationTimestamp="2025-10-10 06:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:47:10.232325116 +0000 UTC m=+1377.327483312" watchObservedRunningTime="2025-10-10 06:47:10.236580189 +0000 UTC m=+1377.331738385"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.244708 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.244840 4822 scope.go:117] "RemoveContainer" containerID="9baba77c235fa2da53a1933646bcf77963daf0d1bceccb6370d40760af07bd34"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.264995 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9" (OuterVolumeSpecName: "kube-api-access-2m4r9") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "kube-api-access-2m4r9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.265564 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts" (OuterVolumeSpecName: "scripts") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.269929 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5d11cec-09a9-4adc-9889-cc90f8b983e1" (UID: "b5d11cec-09a9-4adc-9889-cc90f8b983e1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.270507 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data" (OuterVolumeSpecName: "config-data") pod "f0da7840-eaa9-46a7-bda6-5de928993572" (UID: "f0da7840-eaa9-46a7-bda6-5de928993572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.274160 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px" (OuterVolumeSpecName: "kube-api-access-6v4px") pod "09b24550-0f5f-46ff-bf11-192fa1f15650" (UID: "09b24550-0f5f-46ff-bf11-192fa1f15650"). InnerVolumeSpecName "kube-api-access-6v4px". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.301815 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z" (OuterVolumeSpecName: "kube-api-access-5g77z") pod "ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" (UID: "ba1c43c3-6e05-459e-9692-b8ddeb17e0e9"). InnerVolumeSpecName "kube-api-access-5g77z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.307631 4822 scope.go:117] "RemoveContainer" containerID="080914d126025b6e485b39c65b1fd29e1335110a9ec2a7216a930a1f2cc0164e"
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.325750 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle\") pod \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.325950 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs\") pod \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.325986 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config\") pod \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.326062 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvh9\" (UniqueName: \"kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9\") pod \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.327304 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data\") pod \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\" (UID: \"e15a2ac4-8be3-40a0-8f79-56c0ad8b034f\") "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.329443 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" (UID: "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332581 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-logs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332724 4822 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332742 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m4r9\" (UniqueName: \"kubernetes.io/projected/14ce9853-109f-456d-b51c-b1d11072a90d-kube-api-access-2m4r9\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332752 4822 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/afb8479c-8058-4a41-9bc7-8fd09bd321d8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332761 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332781 4822 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d11cec-09a9-4adc-9889-cc90f8b983e1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332795 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g77z\" (UniqueName: \"kubernetes.io/projected/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9-kube-api-access-5g77z\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332830 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332842 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v4px\" (UniqueName: \"kubernetes.io/projected/09b24550-0f5f-46ff-bf11-192fa1f15650-kube-api-access-6v4px\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332857 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0da7840-eaa9-46a7-bda6-5de928993572-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.332866 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14ce9853-109f-456d-b51c-b1d11072a90d-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 10
06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.334317 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data" (OuterVolumeSpecName: "config-data") pod "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" (UID: "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.347046 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9" (OuterVolumeSpecName: "kube-api-access-4jvh9") pod "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" (UID: "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f"). InnerVolumeSpecName "kube-api-access-4jvh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.349533 4822 scope.go:117] "RemoveContainer" containerID="ad3e61286e546aa785f998bc78158faaf28a9227d533f3ce1eeace9ba41ab652" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.356638 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.361638 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.375924 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data" (OuterVolumeSpecName: "config-data") pod "09b24550-0f5f-46ff-bf11-192fa1f15650" (UID: "09b24550-0f5f-46ff-bf11-192fa1f15650"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.377926 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell11f81-account-delete-mz2rx"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.378136 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data" (OuterVolumeSpecName: "config-data") pod "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" (UID: "11c95228-48ad-4e25-9cf7-bf0a2a1e4c69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.388420 4822 scope.go:117] "RemoveContainer" containerID="fda58ea5f9b88b81e2eae62a1675670199fddb3e9d024a70d2a8d75abe7fbe9f" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.389522 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.394784 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.398999 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.401636 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data" (OuterVolumeSpecName: "config-data") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.403124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09b24550-0f5f-46ff-bf11-192fa1f15650" (UID: "09b24550-0f5f-46ff-bf11-192fa1f15650"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.404497 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.411909 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.415703 4822 scope.go:117] "RemoveContainer" containerID="a1174022bfa90fdbbc7bdb6448ded60ea8ea9a3effbb5e0c5631a9b1f7bfe1e1" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.427946 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" (UID: "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.433950 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvh9\" (UniqueName: \"kubernetes.io/projected/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-kube-api-access-4jvh9\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.433973 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.433982 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.433992 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.434001 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.434009 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.434017 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.434026 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09b24550-0f5f-46ff-bf11-192fa1f15650-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.434034 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.434105 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.434122 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.434130 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:47:10 crc 
kubenswrapper[4822]: E1010 06:47:10.434141 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.434178 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:18.43416547 +0000 UTC m=+1385.529323666 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.437745 4822 scope.go:117] "RemoveContainer" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.439942 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "14ce9853-109f-456d-b51c-b1d11072a90d" (UID: "14ce9853-109f-456d-b51c-b1d11072a90d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.457643 4822 scope.go:117] "RemoveContainer" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.457736 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.459627 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca\": container with ID starting with 136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca not found: ID does not exist" containerID="136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.459669 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca"} err="failed to get container status \"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca\": rpc error: code = NotFound desc = could not find container \"136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca\": container with ID starting with 136b214956267282c0795fa7d7a563428135b4927e211d58b042bccf93eab6ca not found: ID does not exist" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.459696 4822 scope.go:117] "RemoveContainer" containerID="d17675142c2329604a137158115089b304e6ce04d8ec7938439c69f02db6cd14" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.464826 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" (UID: "e15a2ac4-8be3-40a0-8f79-56c0ad8b034f"). 
InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.472775 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapibdb6-account-delete-qgrcb"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.479584 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.483652 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.496080 4822 scope.go:117] "RemoveContainer" containerID="489b35ea36a581d0ca50df223a61ef6883583aa6d15472a109b5513aa80e1f47" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.536049 4822 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.537026 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ce9853-109f-456d-b51c-b1d11072a90d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.563303 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.579180 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.585054 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" 
cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.588010 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.593304 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.593353 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="galera" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.608006 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.633354 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.648604 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.674323 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b5444654f-5wp86"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.691626 4822 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.708990 4822 scope.go:117] "RemoveContainer" containerID="a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.711789 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c8fcfdd4-4gk9s"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.738056 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.743026 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.750859 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.751118 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.752965 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.753096 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.753207 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.758603 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.845190 4822 scope.go:117] "RemoveContainer" containerID="a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8" Oct 10 06:47:10 crc kubenswrapper[4822]: E1010 06:47:10.845894 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8\": container with ID starting with a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8 not found: ID does not exist" containerID="a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.845943 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8"} err="failed to get container status \"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8\": rpc error: code = NotFound desc = could not find container \"a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8\": container with ID starting with a30c45675c4205e8b13df797f2d0d2ca651485ad2e2ffe30146af6975c8caea8 not found: ID does not exist" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.845976 4822 scope.go:117] 
"RemoveContainer" containerID="c06e8d4a5f40e2de362c673f9a2482416e108f4360a178ff0fccd0dcaf4c36e9" Oct 10 06:47:10 crc kubenswrapper[4822]: I1010 06:47:10.902050 4822 scope.go:117] "RemoveContainer" containerID="3eab14a45cceb25e06794fdc7e4a763b3c3af3f0e97c2df5eb1c6cd89012256a" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.155570 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.200:3000/\": dial tcp 10.217.0.200:3000: connect: connection refused" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.190302 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.203657 4822 generic.go:334] "Generic (PLEG): container finished" podID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerID="a829a2721fe99b524e5ca7cb2318d1332bb2968f94f80cd142fd5a74891aa843" exitCode=0 Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.203740 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerDied","Data":"a829a2721fe99b524e5ca7cb2318d1332bb2968f94f80cd142fd5a74891aa843"} Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.203769 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1fa59157-6b4e-4379-89e0-415e74c581a8","Type":"ContainerDied","Data":"3cb6b0d8598096cfa4a9302c0c53a869fca682262baf741257f4945f73e789e9"} Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.203785 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb6b0d8598096cfa4a9302c0c53a869fca682262baf741257f4945f73e789e9" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.217380 4822 generic.go:334] "Generic (PLEG): container 
finished" podID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerID="b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" exitCode=0 Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.217458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerDied","Data":"b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a"} Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.221932 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.227280 4822 generic.go:334] "Generic (PLEG): container finished" podID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerID="65212beb044396c22b0fae65dc35f02089f6f4279e19167d0de48c69116a6853" exitCode=0 Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.227323 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerDied","Data":"65212beb044396c22b0fae65dc35f02089f6f4279e19167d0de48c69116a6853"} Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.230671 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.268033 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.293381 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.336428 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.348121 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355344 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bxh6\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355387 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355447 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355471 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355494 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355510 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355525 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355555 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355575 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") " Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355606 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.355652 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie\") pod \"1fa59157-6b4e-4379-89e0-415e74c581a8\" (UID: \"1fa59157-6b4e-4379-89e0-415e74c581a8\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.356498 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.357101 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.357792 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.362221 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6" (OuterVolumeSpecName: "kube-api-access-5bxh6") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "kube-api-access-5bxh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.369721 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.370551 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.371944 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info" (OuterVolumeSpecName: "pod-info") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.373892 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.387987 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.400239 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data" (OuterVolumeSpecName: "config-data") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.405041 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf" (OuterVolumeSpecName: "server-conf") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "server-conf".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457578 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bxh6\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-kube-api-access-5bxh6\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457606 4822 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457615 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457643 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457653 4822 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fa59157-6b4e-4379-89e0-415e74c581a8-server-conf\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457661 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457672 4822 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fa59157-6b4e-4379-89e0-415e74c581a8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457679 4822 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fa59157-6b4e-4379-89e0-415e74c581a8-pod-info\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457687 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.457695 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.478485 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.495189 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1fa59157-6b4e-4379-89e0-415e74c581a8" (UID: "1fa59157-6b4e-4379-89e0-415e74c581a8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558334 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558427 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558458 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558481 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558509 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558542 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558580 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558610 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558626 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7s5\" (UniqueName: \"kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5\") pod \"ebe2e09c-1139-449c-919b-206fbe0614ab\" (UID: \"ebe2e09c-1139-449c-919b-206fbe0614ab\") "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558931 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.558942 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fa59157-6b4e-4379-89e0-415e74c581a8-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.559054 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated"
(OuterVolumeSpecName: "config-data-generated") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.559604 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.560317 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.560745 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.573768 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets" (OuterVolumeSpecName: "secrets") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.573790 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5" (OuterVolumeSpecName: "kube-api-access-kx7s5") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "kube-api-access-kx7s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.602511 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.603589 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.617088 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ebe2e09c-1139-449c-919b-206fbe0614ab" (UID: "ebe2e09c-1139-449c-919b-206fbe0614ab"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.660921 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.660959 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-default\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.660972 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.660984 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ebe2e09c-1139-449c-919b-206fbe0614ab-config-data-generated\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.660995 4822 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661005 4822 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe2e09c-1139-449c-919b-206fbe0614ab-operator-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661015 4822 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661028 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7s5\" (UniqueName: \"kubernetes.io/projected/ebe2e09c-1139-449c-919b-206fbe0614ab-kube-api-access-kx7s5\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661038 4822 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ebe2e09c-1139-449c-919b-206fbe0614ab-secrets\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661116 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" path="/var/lib/kubelet/pods/09b24550-0f5f-46ff-bf11-192fa1f15650/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.661696 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" path="/var/lib/kubelet/pods/11c95228-48ad-4e25-9cf7-bf0a2a1e4c69/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.662337 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" path="/var/lib/kubelet/pods/14ce9853-109f-456d-b51c-b1d11072a90d/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.663770 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" path="/var/lib/kubelet/pods/35854fe5-2e29-4a49-9783-873bee1058e2/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.678556 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2b7a8e-ab63-4d56-929e-1e6898294956" path="/var/lib/kubelet/pods/6c2b7a8e-ab63-4d56-929e-1e6898294956/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.683464 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" path="/var/lib/kubelet/pods/afb8479c-8058-4a41-9bc7-8fd09bd321d8/volumes"
Oct 10
06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.684040 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" path="/var/lib/kubelet/pods/b5d11cec-09a9-4adc-9889-cc90f8b983e1/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.685330 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" path="/var/lib/kubelet/pods/ba1c43c3-6e05-459e-9692-b8ddeb17e0e9/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.686731 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" path="/var/lib/kubelet/pods/c0da5e90-c960-4d67-9c19-6854f61dee14/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.687683 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5180126-ac55-464c-90dd-565daffba54c" path="/var/lib/kubelet/pods/d5180126-ac55-464c-90dd-565daffba54c/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.689812 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.696471 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" path="/var/lib/kubelet/pods/e15a2ac4-8be3-40a0-8f79-56c0ad8b034f/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.697573 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" path="/var/lib/kubelet/pods/e5c43dc5-0a44-497d-8d7c-3a818ddf1735/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.698765 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" path="/var/lib/kubelet/pods/f0da7840-eaa9-46a7-bda6-5de928993572/volumes"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.763249 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:11 crc kubenswrapper[4822]: E1010 06:47:11.766205 4822 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Oct 10 06:47:11 crc kubenswrapper[4822]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-10T06:47:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Oct 10 06:47:11 crc kubenswrapper[4822]: /etc/init.d/functions: line 589: 421 Alarm clock "$@"
Oct 10 06:47:11 crc kubenswrapper[4822]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-5jdbx" message=<
Oct 10 06:47:11 crc kubenswrapper[4822]: Exiting ovn-controller (1) [FAILED]
Oct 10 06:47:11 crc kubenswrapper[4822]: Killing ovn-controller (1) [ OK ]
Oct 10 06:47:11 crc kubenswrapper[4822]: Killing ovn-controller (1) with SIGKILL [ OK ]
Oct 10 06:47:11 crc kubenswrapper[4822]: 2025-10-10T06:47:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Oct 10 06:47:11 crc kubenswrapper[4822]: /etc/init.d/functions: line 589: 421 Alarm clock "$@"
Oct 10 06:47:11 crc kubenswrapper[4822]: >
Oct 10 06:47:11 crc kubenswrapper[4822]: E1010 06:47:11.766243 4822 kuberuntime_container.go:691] "PreStop hook failed" err=<
Oct 10 06:47:11 crc kubenswrapper[4822]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-10T06:47:04Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Oct 10 06:47:11 crc kubenswrapper[4822]: /etc/init.d/functions: line 589: 421 Alarm clock "$@"
Oct 10 06:47:11 crc kubenswrapper[4822]: > pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" containerID="cri-o://8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628"
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.766281 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-5jdbx" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" containerID="cri-o://8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628" gracePeriod=22
Oct 10 06:47:11 crc kubenswrapper[4822]: I1010 06:47:11.867023 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972321 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972365 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972393 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972479 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972530 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972556 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972587 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972620 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972636 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tc4\" (UniqueName:
\"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.972651 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls\") pod \"48fba34a-0289-41f0-b1d7-bb71a22253a3\" (UID: \"48fba34a-0289-41f0-b1d7-bb71a22253a3\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.973316 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.977223 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.978289 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.980005 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info" (OuterVolumeSpecName: "pod-info") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.980408 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.982542 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.982735 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.988287 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4" (OuterVolumeSpecName: "kube-api-access-h2tc4") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "kube-api-access-h2tc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:11.996991 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data" (OuterVolumeSpecName: "config-data") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.021837 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf" (OuterVolumeSpecName: "server-conf") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.044571 4822 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-5b98d46cf9-66pcm"
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074015 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074331 4822 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074342 4822 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48fba34a-0289-41f0-b1d7-bb71a22253a3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074352 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074361 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tc4\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-kube-api-access-h2tc4\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074370 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074379 4822 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-server-conf\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074387 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48fba34a-0289-41f0-b1d7-bb71a22253a3-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074394 4822 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48fba34a-0289-41f0-b1d7-bb71a22253a3-pod-info\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.074402 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.088037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "48fba34a-0289-41f0-b1d7-bb71a22253a3" (UID: "48fba34a-0289-41f0-b1d7-bb71a22253a3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.131775 4822 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175243 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175298 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175324 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175413 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pxn5\" (UniqueName: \"kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175432 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175463 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175522 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175570 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs\") pod \"dc3ce4fd-4bba-4242-91ba-076cf3729770\" (UID: \"dc3ce4fd-4bba-4242-91ba-076cf3729770\") "
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175897 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48fba34a-0289-41f0-b1d7-bb71a22253a3-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.175915 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.182358 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts" (OuterVolumeSpecName: "scripts") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.183976 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.184909 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.187172 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5" (OuterVolumeSpecName: "kube-api-access-4pxn5") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "kube-api-access-4pxn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.203970 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.217649 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5jdbx_27c9c088-64aa-44cd-8e1d-5e007e0d309b/ovn-controller/0.log" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.217718 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jdbx" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.224062 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data" (OuterVolumeSpecName: "config-data") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.229907 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.235900 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc3ce4fd-4bba-4242-91ba-076cf3729770" (UID: "dc3ce4fd-4bba-4242-91ba-076cf3729770"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284290 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284333 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284345 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284355 4822 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284366 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284379 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pxn5\" (UniqueName: \"kubernetes.io/projected/dc3ce4fd-4bba-4242-91ba-076cf3729770-kube-api-access-4pxn5\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284390 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.284400 4822 
reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc3ce4fd-4bba-4242-91ba-076cf3729770-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.288040 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ebe2e09c-1139-449c-919b-206fbe0614ab","Type":"ContainerDied","Data":"10897d820ebbf299d03df7a8376ecfb2b3e358c5a11ed6b377f9b97862f6ed5c"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.288087 4822 scope.go:117] "RemoveContainer" containerID="b00a54806cd96976ca6bd2fdcc9b4301260b66774ffe1c7de0aa58b81328d35a" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.288232 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.293915 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5jdbx_27c9c088-64aa-44cd-8e1d-5e007e0d309b/ovn-controller/0.log" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.293963 4822 generic.go:334] "Generic (PLEG): container finished" podID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerID="8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628" exitCode=137 Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.294028 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx" event={"ID":"27c9c088-64aa-44cd-8e1d-5e007e0d309b","Type":"ContainerDied","Data":"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.294055 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jdbx" event={"ID":"27c9c088-64aa-44cd-8e1d-5e007e0d309b","Type":"ContainerDied","Data":"26207bf121036d4614299d8c3674c766a097beef5ad362bb997783a2cb1eae80"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.294945 
4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jdbx" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.298063 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.298067 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48fba34a-0289-41f0-b1d7-bb71a22253a3","Type":"ContainerDied","Data":"1feeb41b38d8c35a54ee7dcc51b31b38edd8105b60ae5790db2e613cd851ac04"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.301593 4822 generic.go:334] "Generic (PLEG): container finished" podID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" exitCode=0 Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.301669 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7ab4fbc-298d-4250-bf01-a73155f35532","Type":"ContainerDied","Data":"6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.315145 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc3ce4fd-4bba-4242-91ba-076cf3729770" containerID="aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6" exitCode=0 Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.315370 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b98d46cf9-66pcm" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.315499 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b98d46cf9-66pcm" event={"ID":"dc3ce4fd-4bba-4242-91ba-076cf3729770","Type":"ContainerDied","Data":"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.315624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b98d46cf9-66pcm" event={"ID":"dc3ce4fd-4bba-4242-91ba-076cf3729770","Type":"ContainerDied","Data":"42176a300fc8047fe12fa1c102b7a1082e60513284de2910e9936d11a415c9bc"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.315874 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.322100 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.329720 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f48ef71-8ab0-4ed4-a58c-78046ec184b6/ovn-northd/0.log" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.329753 4822 generic.go:334] "Generic (PLEG): container finished" podID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerID="964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" exitCode=139 Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.329847 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.330545 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerDied","Data":"964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569"} Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.335653 4822 scope.go:117] "RemoveContainer" containerID="145590554ea08262793a80e00ae8eeb2daf15d2d75801a13902413f227da8393" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.358908 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.371917 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385666 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tk9\" (UniqueName: \"kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385841 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385860 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385885 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385903 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385949 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.385973 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn\") pod \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\" (UID: \"27c9c088-64aa-44cd-8e1d-5e007e0d309b\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.386295 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.388789 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run" (OuterVolumeSpecName: "var-run") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.388892 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.389947 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts" (OuterVolumeSpecName: "scripts") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.391077 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.393088 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9" (OuterVolumeSpecName: "kube-api-access-h2tk9") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "kube-api-access-h2tk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.393792 4822 scope.go:117] "RemoveContainer" containerID="8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.405417 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5b98d46cf9-66pcm"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.416101 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f48ef71-8ab0-4ed4-a58c-78046ec184b6/ovn-northd/0.log" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.416183 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.416208 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.421378 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.421686 4822 scope.go:117] "RemoveContainer" containerID="8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628" Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.422074 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628\": container with ID starting with 8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628 not found: ID does not exist" containerID="8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.422173 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628"} err="failed to get container 
status \"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628\": rpc error: code = NotFound desc = could not find container \"8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628\": container with ID starting with 8c5489ace61492e01ee80dc9f8a3565a1f423448dfeaffbc516e7b1c67717628 not found: ID does not exist" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.422280 4822 scope.go:117] "RemoveContainer" containerID="65212beb044396c22b0fae65dc35f02089f6f4279e19167d0de48c69116a6853" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.423464 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.480913 4822 scope.go:117] "RemoveContainer" containerID="32c5a9d6afcaeb38dc9800deb4a9ed3a02f085527940c8caf49522b1f31a6f55" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488279 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27c9c088-64aa-44cd-8e1d-5e007e0d309b-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488304 4822 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488313 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488323 4822 
reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488331 4822 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/27c9c088-64aa-44cd-8e1d-5e007e0d309b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.488339 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tk9\" (UniqueName: \"kubernetes.io/projected/27c9c088-64aa-44cd-8e1d-5e007e0d309b-kube-api-access-h2tk9\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.494685 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "27c9c088-64aa-44cd-8e1d-5e007e0d309b" (UID: "27c9c088-64aa-44cd-8e1d-5e007e0d309b"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.588900 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589020 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589049 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589109 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589136 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589190 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdqmg\" (UniqueName: 
\"kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589212 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle\") pod \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\" (UID: \"3f48ef71-8ab0-4ed4-a58c-78046ec184b6\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.589754 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590027 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config" (OuterVolumeSpecName: "config") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590185 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts" (OuterVolumeSpecName: "scripts") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590743 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590769 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c9c088-64aa-44cd-8e1d-5e007e0d309b-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590784 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.590912 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.593066 4822 scope.go:117] "RemoveContainer" containerID="aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.615300 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg" (OuterVolumeSpecName: "kube-api-access-wdqmg") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "kube-api-access-wdqmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.650897 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.657277 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.683950 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "3f48ef71-8ab0-4ed4-a58c-78046ec184b6" (UID: "3f48ef71-8ab0-4ed4-a58c-78046ec184b6"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.692362 4822 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.692388 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdqmg\" (UniqueName: \"kubernetes.io/projected/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-kube-api-access-wdqmg\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.692401 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.692412 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f48ef71-8ab0-4ed4-a58c-78046ec184b6-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.729869 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.754503 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5jdbx"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.756546 4822 scope.go:117] "RemoveContainer" containerID="aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6" Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.758270 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6\": container with ID starting with aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6 not found: ID does not exist" containerID="aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.758305 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6"} err="failed to get container status \"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6\": rpc error: code = NotFound desc = could not find container \"aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6\": container with ID starting with aa28ae0826a29200a2d29ca777871993930ca39a4916e694e41a0baa489772a6 not found: ID does not exist" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.764128 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5jdbx"] Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.800740 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.894995 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt7nc\" (UniqueName: \"kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc\") pod \"c7ab4fbc-298d-4250-bf01-a73155f35532\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895049 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data\") pod \"c7ab4fbc-298d-4250-bf01-a73155f35532\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895072 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data\") pod \"3d602476-cde4-435f-93bc-a72c137d1b58\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895117 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhhbs\" (UniqueName: \"kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs\") pod \"3d602476-cde4-435f-93bc-a72c137d1b58\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895183 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle\") pod \"c7ab4fbc-298d-4250-bf01-a73155f35532\" (UID: \"c7ab4fbc-298d-4250-bf01-a73155f35532\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895241 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs\") pod \"3d602476-cde4-435f-93bc-a72c137d1b58\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895345 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom\") pod \"3d602476-cde4-435f-93bc-a72c137d1b58\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.895371 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle\") pod \"3d602476-cde4-435f-93bc-a72c137d1b58\" (UID: \"3d602476-cde4-435f-93bc-a72c137d1b58\") " Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.897023 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs" (OuterVolumeSpecName: "logs") pod "3d602476-cde4-435f-93bc-a72c137d1b58" (UID: "3d602476-cde4-435f-93bc-a72c137d1b58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.899039 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs" (OuterVolumeSpecName: "kube-api-access-zhhbs") pod "3d602476-cde4-435f-93bc-a72c137d1b58" (UID: "3d602476-cde4-435f-93bc-a72c137d1b58"). InnerVolumeSpecName "kube-api-access-zhhbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.899645 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d602476-cde4-435f-93bc-a72c137d1b58" (UID: "3d602476-cde4-435f-93bc-a72c137d1b58"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.901185 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc" (OuterVolumeSpecName: "kube-api-access-qt7nc") pod "c7ab4fbc-298d-4250-bf01-a73155f35532" (UID: "c7ab4fbc-298d-4250-bf01-a73155f35532"). InnerVolumeSpecName "kube-api-access-qt7nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.933016 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.933784 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.933872 4822 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.935067 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.935214 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.935281 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.948552 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:12 crc kubenswrapper[4822]: E1010 06:47:12.948612 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.949167 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d602476-cde4-435f-93bc-a72c137d1b58" (UID: "3d602476-cde4-435f-93bc-a72c137d1b58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.949325 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data" (OuterVolumeSpecName: "config-data") pod "c7ab4fbc-298d-4250-bf01-a73155f35532" (UID: "c7ab4fbc-298d-4250-bf01-a73155f35532"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.950503 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7ab4fbc-298d-4250-bf01-a73155f35532" (UID: "c7ab4fbc-298d-4250-bf01-a73155f35532"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:12 crc kubenswrapper[4822]: I1010 06:47:12.983919 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data" (OuterVolumeSpecName: "config-data") pod "3d602476-cde4-435f-93bc-a72c137d1b58" (UID: "3d602476-cde4-435f-93bc-a72c137d1b58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.007947 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d602476-cde4-435f-93bc-a72c137d1b58-logs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.007992 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008007 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008019 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt7nc\" (UniqueName: \"kubernetes.io/projected/c7ab4fbc-298d-4250-bf01-a73155f35532-kube-api-access-qt7nc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008031 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008044 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d602476-cde4-435f-93bc-a72c137d1b58-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008055 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhhbs\" (UniqueName: \"kubernetes.io/projected/3d602476-cde4-435f-93bc-a72c137d1b58-kube-api-access-zhhbs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.008066 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ab4fbc-298d-4250-bf01-a73155f35532-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.340745 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.340818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7ab4fbc-298d-4250-bf01-a73155f35532","Type":"ContainerDied","Data":"2be616b0d8528d462efbcbdc4480ebb3fbbfa728eac5db6dd39a36ed9132cbc1"} Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.341715 4822 scope.go:117] "RemoveContainer" containerID="6b737a1b4f872a095a270515815dc5f00920d87b7311a06598bf41f4034fd404" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.345751 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3f48ef71-8ab0-4ed4-a58c-78046ec184b6/ovn-northd/0.log" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.345874 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3f48ef71-8ab0-4ed4-a58c-78046ec184b6","Type":"ContainerDied","Data":"8c8d5b8afcf4ef3c8110883c26cf163cbbbe90ef7916c4a1e71d7f4f73f1ab4b"} Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.345914 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.349621 4822 generic.go:334] "Generic (PLEG): container finished" podID="3d602476-cde4-435f-93bc-a72c137d1b58" containerID="1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1" exitCode=0 Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.349686 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerDied","Data":"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1"} Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.349710 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" event={"ID":"3d602476-cde4-435f-93bc-a72c137d1b58","Type":"ContainerDied","Data":"20f51c5a34c2dca09531387d9af017473fa5f79978b90eb4eb6296f3d3cdd1ab"} Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.349730 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-766cf74578-rdxjc" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.420716 4822 scope.go:117] "RemoveContainer" containerID="a699c14506381f4dd996d947b346eb98c9c53450eafb1392ee285ce66c591795" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.424876 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.430070 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-766cf74578-rdxjc"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.439747 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.464480 4822 scope.go:117] "RemoveContainer" containerID="964909e2cdc80f417c20b465405c46d8598b7ea72fd65c592ef2973b0b058569" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.479629 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.486083 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.493117 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.496489 4822 scope.go:117] "RemoveContainer" containerID="1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.522742 4822 scope.go:117] "RemoveContainer" containerID="72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.593926 4822 scope.go:117] "RemoveContainer" containerID="1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1" Oct 10 06:47:13 crc kubenswrapper[4822]: E1010 
06:47:13.595211 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1\": container with ID starting with 1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1 not found: ID does not exist" containerID="1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.595233 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1"} err="failed to get container status \"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1\": rpc error: code = NotFound desc = could not find container \"1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1\": container with ID starting with 1b0737b484ce16b5e0d8293204b45998675ea7023369d4ecca91787ef98479c1 not found: ID does not exist" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.595257 4822 scope.go:117] "RemoveContainer" containerID="72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a" Oct 10 06:47:13 crc kubenswrapper[4822]: E1010 06:47:13.595652 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a\": container with ID starting with 72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a not found: ID does not exist" containerID="72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.595688 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a"} err="failed to get container status \"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a\": rpc 
error: code = NotFound desc = could not find container \"72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a\": container with ID starting with 72e3f11430969e8a211e2e5b54265f2d3f81f0c0f409628710b984fb6d5c049a not found: ID does not exist" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.661626 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.672405 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c7474d4d9-hl56q" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.691295 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" path="/var/lib/kubelet/pods/1fa59157-6b4e-4379-89e0-415e74c581a8/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.692000 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" path="/var/lib/kubelet/pods/27c9c088-64aa-44cd-8e1d-5e007e0d309b/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.693384 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" path="/var/lib/kubelet/pods/3d602476-cde4-435f-93bc-a72c137d1b58/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.696288 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" path="/var/lib/kubelet/pods/3f48ef71-8ab0-4ed4-a58c-78046ec184b6/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.697029 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" 
path="/var/lib/kubelet/pods/48fba34a-0289-41f0-b1d7-bb71a22253a3/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.697952 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" path="/var/lib/kubelet/pods/c7ab4fbc-298d-4250-bf01-a73155f35532/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.698395 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3ce4fd-4bba-4242-91ba-076cf3729770" path="/var/lib/kubelet/pods/dc3ce4fd-4bba-4242-91ba-076cf3729770/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.698950 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" path="/var/lib/kubelet/pods/ebe2e09c-1139-449c-919b-206fbe0614ab/volumes" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818711 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d75b\" (UniqueName: \"kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818776 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818815 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818860 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818919 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818957 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.818978 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd\") pod \"151eccad-6f76-476d-a2f4-53123f29bdb7\" (UID: \"151eccad-6f76-476d-a2f4-53123f29bdb7\") " Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.819555 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.819940 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.823206 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts" (OuterVolumeSpecName: "scripts") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.823304 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b" (OuterVolumeSpecName: "kube-api-access-7d75b") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "kube-api-access-7d75b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.855313 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.860192 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.899832 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data" (OuterVolumeSpecName: "config-data") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.900311 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "151eccad-6f76-476d-a2f4-53123f29bdb7" (UID: "151eccad-6f76-476d-a2f4-53123f29bdb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920433 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d75b\" (UniqueName: \"kubernetes.io/projected/151eccad-6f76-476d-a2f4-53123f29bdb7-kube-api-access-7d75b\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920472 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920486 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920499 4822 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920509 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920520 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920530 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/151eccad-6f76-476d-a2f4-53123f29bdb7-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:13 crc kubenswrapper[4822]: I1010 06:47:13.920541 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/151eccad-6f76-476d-a2f4-53123f29bdb7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.381035 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: i/o timeout"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.381364 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c8fcfdd4-4gk9s" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.403973 4822 generic.go:334] "Generic (PLEG): container finished" podID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerID="449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13" exitCode=0
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.404238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerDied","Data":"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"}
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.404363 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"151eccad-6f76-476d-a2f4-53123f29bdb7","Type":"ContainerDied","Data":"469406a00ece231f5e75c8f55955bf85d697a9eea052ec38a3ddac6cf51ad9a3"}
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.404521 4822 scope.go:117] "RemoveContainer" containerID="8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.404795 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.430996 4822 scope.go:117] "RemoveContainer" containerID="80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.439668 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.446201 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.479010 4822 scope.go:117] "RemoveContainer" containerID="449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.504838 4822 scope.go:117] "RemoveContainer" containerID="4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.526491 4822 scope.go:117] "RemoveContainer" containerID="8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"
Oct 10 06:47:14 crc kubenswrapper[4822]: E1010 06:47:14.527139 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9\": container with ID starting with 8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9 not found: ID does not exist" containerID="8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.527250 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9"} err="failed to get container status \"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9\": rpc error: code = NotFound desc = could not find container \"8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9\": container with ID starting with 8b448cde9e873034e0ac60c58bbc08850bc1ed6b153e034b1d977b6d4a9b80a9 not found: ID does not exist"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.527340 4822 scope.go:117] "RemoveContainer" containerID="80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"
Oct 10 06:47:14 crc kubenswrapper[4822]: E1010 06:47:14.528433 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46\": container with ID starting with 80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46 not found: ID does not exist" containerID="80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.528490 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46"} err="failed to get container status \"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46\": rpc error: code = NotFound desc = could not find container \"80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46\": container with ID starting with 80b23faadff772f107f152dcce9389ccec40a899adef3dee0a8b9738ba900b46 not found: ID does not exist"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.528524 4822 scope.go:117] "RemoveContainer" containerID="449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"
Oct 10 06:47:14 crc kubenswrapper[4822]: E1010 06:47:14.529480 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13\": container with ID starting with 449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13 not found: ID does not exist" containerID="449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.529507 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13"} err="failed to get container status \"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13\": rpc error: code = NotFound desc = could not find container \"449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13\": container with ID starting with 449b7d0ee7375995a559d088b8434d7df8bd3aa35ae05919620a4d6be0386f13 not found: ID does not exist"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.529551 4822 scope.go:117] "RemoveContainer" containerID="4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"
Oct 10 06:47:14 crc kubenswrapper[4822]: E1010 06:47:14.529908 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f\": container with ID starting with 4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f not found: ID does not exist" containerID="4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"
Oct 10 06:47:14 crc kubenswrapper[4822]: I1010 06:47:14.529931 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f"} err="failed to get container status \"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f\": rpc error: code = NotFound desc = could not find container \"4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f\": container with ID starting with 4fd92d764e17ad2f30f16fca73a2fceb2f99c2dbfe81ab07125a05d205804f2f not found: ID does not exist"
Oct 10 06:47:15 crc kubenswrapper[4822]: I1010 06:47:15.661720 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" path="/var/lib/kubelet/pods/151eccad-6f76-476d-a2f4-53123f29bdb7/volumes"
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.913896 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.914596 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.914976 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.915011 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server"
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.915233 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.917336 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.918618 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:17 crc kubenswrapper[4822]: E1010 06:47:17.918674 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd"
Oct 10 06:47:18 crc kubenswrapper[4822]: E1010 06:47:18.489574 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found
Oct 10 06:47:18 crc kubenswrapper[4822]: E1010 06:47:18.489830 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Oct 10 06:47:18 crc kubenswrapper[4822]: E1010 06:47:18.489910 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 10 06:47:18 crc kubenswrapper[4822]: E1010 06:47:18.489975 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Oct 10 06:47:18 crc kubenswrapper[4822]: E1010 06:47:18.490074 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:47:34.490057909 +0000 UTC m=+1401.585216105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.917708 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.921533 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.921509 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.922436 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.922479 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server"
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.924634 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.928525 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 10 06:47:22 crc kubenswrapper[4822]: E1010 06:47:22.929095 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.077733 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"]
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078103 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078120 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078135 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="init"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078142 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="init"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078152 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="proxy-httpd"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078158 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="proxy-httpd"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078167 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078172 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078181 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078187 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078197 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="setup-container"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078203 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="setup-container"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078213 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078221 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078231 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="ovsdbserver-sb"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078237 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="ovsdbserver-sb"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078246 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="rabbitmq"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078252 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="rabbitmq"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078260 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="ovsdbserver-nb"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078266 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="ovsdbserver-nb"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078274 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3ce4fd-4bba-4242-91ba-076cf3729770" containerName="keystone-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078279 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3ce4fd-4bba-4242-91ba-076cf3729770" containerName="keystone-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078287 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078293 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078299 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078306 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078317 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078323 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078333 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078339 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078349 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="galera"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078355 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="galera"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078365 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078371 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078380 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerName="nova-cell0-conductor-conductor"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078385 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerName="nova-cell0-conductor-conductor"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078396 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="sg-core"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078402 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="sg-core"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078413 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078420 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078430 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078436 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078448 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078455 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-log"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078464 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-central-agent"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078469 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-central-agent"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078481 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078486 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078495 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="rabbitmq"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078501 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="rabbitmq"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078510 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" containerName="nova-cell1-novncproxy-novncproxy"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078516 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" containerName="nova-cell1-novncproxy-novncproxy"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078523 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="mysql-bootstrap"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078529 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="mysql-bootstrap"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078537 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" containerName="memcached"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078542 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" containerName="memcached"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078550 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="setup-container"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078555 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="setup-container"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078565 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078571 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078579 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078584 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078590 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-server"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078596 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-server"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078606 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerName="nova-scheduler-scheduler"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078612 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerName="nova-scheduler-scheduler"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078622 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241b1f65-5edb-4965-b9af-e8e12b73124c" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078628 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="241b1f65-5edb-4965-b9af-e8e12b73124c" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078637 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078643 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078653 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078658 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078668 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="probe"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078673 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="probe"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078680 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="cinder-scheduler"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078685 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="cinder-scheduler"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078694 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" containerName="kube-state-metrics"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078699 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" containerName="kube-state-metrics"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078708 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078713 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078720 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="galera"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078726 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="galera"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078736 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="mysql-bootstrap"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078742 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="mysql-bootstrap"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078751 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078757 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078767 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="dnsmasq-dns"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078773 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="dnsmasq-dns"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078782 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-httpd"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078787 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-httpd"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078796 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-notification-agent"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078846 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-notification-agent"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078855 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa25f12-05b2-4632-921a-d126059a63be" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078861 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa25f12-05b2-4632-921a-d126059a63be" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078868 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2b7a8e-ab63-4d56-929e-1e6898294956" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078874 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2b7a8e-ab63-4d56-929e-1e6898294956" containerName="mariadb-account-delete"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078881 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078886 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="openstack-network-exporter"
Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078893 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api"
Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078899 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0da7840-eaa9-46a7-bda6-5de928993572"
containerName="cinder-api" Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078907 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078912 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078919 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078925 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener-log" Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078934 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078939 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api-log" Oct 10 06:47:23 crc kubenswrapper[4822]: E1010 06:47:23.078947 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.078952 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079083 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa59157-6b4e-4379-89e0-415e74c581a8" containerName="rabbitmq" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079096 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" 
containerName="proxy-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079109 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079125 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079139 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c9c088-64aa-44cd-8e1d-5e007e0d309b" containerName="ovn-controller" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079149 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-api" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079159 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ab4fbc-298d-4250-bf01-a73155f35532" containerName="nova-cell1-conductor-conductor" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079167 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="openstack-network-exporter" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079181 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2b7a8e-ab63-4d56-929e-1e6898294956" containerName="mariadb-account-delete" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079195 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="999b3a9f-9559-4baa-9f36-4f91631fb1fc" containerName="dnsmasq-dns" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079207 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="openstack-network-exporter" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079217 4822 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ebe2e09c-1139-449c-919b-206fbe0614ab" containerName="galera" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079230 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079241 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="proxy-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079254 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="22236db0-c666-44e4-a290-66626e76cdad" containerName="ovsdbserver-sb" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079265 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" containerName="proxy-server" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079273 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079286 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c43dc5-0a44-497d-8d7c-3a818ddf1735" containerName="galera" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079294 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="108483bc-0a52-4ac2-8086-fa89466ea3aa" containerName="ovsdbserver-nb" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079301 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b24550-0f5f-46ff-bf11-192fa1f15650" containerName="nova-cell0-conductor-conductor" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079314 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079322 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="48fba34a-0289-41f0-b1d7-bb71a22253a3" containerName="rabbitmq" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079332 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079339 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="sg-core" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079350 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="openstack-network-exporter" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079359 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="35854fe5-2e29-4a49-9783-873bee1058e2" containerName="barbican-api" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079367 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3ce4fd-4bba-4242-91ba-076cf3729770" containerName="keystone-api" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079377 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079385 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="241b1f65-5edb-4965-b9af-e8e12b73124c" containerName="openstack-network-exporter" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079396 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f48ef71-8ab0-4ed4-a58c-78046ec184b6" containerName="ovn-northd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079406 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d602476-cde4-435f-93bc-a72c137d1b58" containerName="barbican-keystone-listener" Oct 10 06:47:23 crc 
kubenswrapper[4822]: I1010 06:47:23.079416 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3420c1f4-bf0d-4de6-90a4-c00e0722d911" containerName="placement-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079427 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" containerName="cinder-api" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079438 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1c43c3-6e05-459e-9692-b8ddeb17e0e9" containerName="mariadb-account-delete" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079446 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="cinder-scheduler" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079455 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf4e23c-3df4-4d67-8a61-c97f860aa797" containerName="nova-scheduler-scheduler" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079467 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d11cec-09a9-4adc-9889-cc90f8b983e1" containerName="nova-metadata-metadata" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079476 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce9853-109f-456d-b51c-b1d11072a90d" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079486 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5180126-ac55-464c-90dd-565daffba54c" containerName="glance-httpd" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079497 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0da5e90-c960-4d67-9c19-6854f61dee14" containerName="nova-api-api" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079508 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0da7840-eaa9-46a7-bda6-5de928993572" 
containerName="cinder-api-log" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079517 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c95228-48ad-4e25-9cf7-bf0a2a1e4c69" containerName="barbican-worker" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079528 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-central-agent" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079538 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb8479c-8058-4a41-9bc7-8fd09bd321d8" containerName="kube-state-metrics" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079547 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="151eccad-6f76-476d-a2f4-53123f29bdb7" containerName="ceilometer-notification-agent" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079555 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" containerName="probe" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079564 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa25f12-05b2-4632-921a-d126059a63be" containerName="mariadb-account-delete" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079574 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15a2ac4-8be3-40a0-8f79-56c0ad8b034f" containerName="memcached" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.079585 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="419c8ee7-56fd-43cc-86de-7f647c708502" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.080981 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.089046 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"] Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.160960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.161040 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbwc\" (UniqueName: \"kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.161080 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.262341 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.262425 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zvbwc\" (UniqueName: \"kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.262454 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.262969 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.263251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.288597 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvbwc\" (UniqueName: \"kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc\") pod \"redhat-marketplace-l2sw5\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.404360 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:23 crc kubenswrapper[4822]: I1010 06:47:23.882006 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"] Oct 10 06:47:24 crc kubenswrapper[4822]: I1010 06:47:24.520201 4822 generic.go:334] "Generic (PLEG): container finished" podID="ca446713-8b21-4ad1-8068-3460c5dad716" containerID="a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843" exitCode=0 Oct 10 06:47:24 crc kubenswrapper[4822]: I1010 06:47:24.520388 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerDied","Data":"a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843"} Oct 10 06:47:24 crc kubenswrapper[4822]: I1010 06:47:24.521370 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerStarted","Data":"556bf9fcde6adbf72faf16568fbbd1483ce0fffc2934161fa2d0103a9964f758"} Oct 10 06:47:24 crc kubenswrapper[4822]: I1010 06:47:24.526680 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 06:47:25 crc kubenswrapper[4822]: I1010 06:47:25.533905 4822 generic.go:334] "Generic (PLEG): container finished" podID="ca446713-8b21-4ad1-8068-3460c5dad716" containerID="2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e" exitCode=0 Oct 10 06:47:25 crc kubenswrapper[4822]: I1010 06:47:25.534300 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerDied","Data":"2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e"} Oct 10 06:47:26 crc kubenswrapper[4822]: I1010 06:47:26.544280 4822 generic.go:334] "Generic (PLEG): container 
finished" podID="7a076d47-5de3-4eba-a933-265448eb8a11" containerID="1dd0282c32b952bc28426c99bc939d6cd5556dd9df6f57d786cbccb734a3f9db" exitCode=0 Oct 10 06:47:26 crc kubenswrapper[4822]: I1010 06:47:26.544761 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerDied","Data":"1dd0282c32b952bc28426c99bc939d6cd5556dd9df6f57d786cbccb734a3f9db"} Oct 10 06:47:26 crc kubenswrapper[4822]: I1010 06:47:26.549690 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerStarted","Data":"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905"} Oct 10 06:47:26 crc kubenswrapper[4822]: I1010 06:47:26.569080 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2sw5" podStartSLOduration=2.089772428 podStartE2EDuration="3.569060843s" podCreationTimestamp="2025-10-10 06:47:23 +0000 UTC" firstStartedPulling="2025-10-10 06:47:24.52643375 +0000 UTC m=+1391.621591946" lastFinishedPulling="2025-10-10 06:47:26.005722135 +0000 UTC m=+1393.100880361" observedRunningTime="2025-10-10 06:47:26.56618511 +0000 UTC m=+1393.661343306" watchObservedRunningTime="2025-10-10 06:47:26.569060843 +0000 UTC m=+1393.664219039" Oct 10 06:47:26 crc kubenswrapper[4822]: I1010 06:47:26.884535 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057100 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057200 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057241 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057292 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057318 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057348 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mb2n\" 
(UniqueName: \"kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.057438 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs\") pod \"7a076d47-5de3-4eba-a933-265448eb8a11\" (UID: \"7a076d47-5de3-4eba-a933-265448eb8a11\") " Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.064169 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n" (OuterVolumeSpecName: "kube-api-access-7mb2n") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "kube-api-access-7mb2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.064418 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.096975 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.098679 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config" (OuterVolumeSpecName: "config") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.101993 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.113486 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.138759 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7a076d47-5de3-4eba-a933-265448eb8a11" (UID: "7a076d47-5de3-4eba-a933-265448eb8a11"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.158905 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.158956 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.158968 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mb2n\" (UniqueName: \"kubernetes.io/projected/7a076d47-5de3-4eba-a933-265448eb8a11-kube-api-access-7mb2n\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.158981 4822 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.158993 4822 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.159004 4822 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.159014 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a076d47-5de3-4eba-a933-265448eb8a11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.562528 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c7474d4d9-hl56q" event={"ID":"7a076d47-5de3-4eba-a933-265448eb8a11","Type":"ContainerDied","Data":"9746516ae03a9d0431e081db90000cdd9ba8b028ceb5e12c38a4ccdc2127be36"} Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.562595 4822 scope.go:117] "RemoveContainer" containerID="a63c8d6a853e7bf676f1d100b90459d644b828358cb3a287d4e0e0565d941778" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.562545 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c7474d4d9-hl56q" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.593915 4822 scope.go:117] "RemoveContainer" containerID="1dd0282c32b952bc28426c99bc939d6cd5556dd9df6f57d786cbccb734a3f9db" Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.597421 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"] Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.605446 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c7474d4d9-hl56q"] Oct 10 06:47:27 crc kubenswrapper[4822]: I1010 06:47:27.657850 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" path="/var/lib/kubelet/pods/7a076d47-5de3-4eba-a933-265448eb8a11/volumes" Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.915044 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.915534 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.916009 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.916161 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.916360 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.919085 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.925698 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:27 crc kubenswrapper[4822]: E1010 06:47:27.925764 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.914468 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.915266 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.915535 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.915570 4822 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.917281 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.919277 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.921447 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 06:47:32 crc kubenswrapper[4822]: E1010 06:47:32.921507 4822 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-nrgt7" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.403580 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:33 crc kubenswrapper[4822]: E1010 06:47:33.403944 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-api" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.403989 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-api" Oct 10 06:47:33 crc kubenswrapper[4822]: E1010 06:47:33.404009 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-httpd" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.404017 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-httpd" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.404169 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-api" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.404187 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a076d47-5de3-4eba-a933-265448eb8a11" containerName="neutron-httpd" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.405265 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.405554 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.405771 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.429604 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.482493 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.561715 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.562057 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctv5f\" (UniqueName: \"kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.562137 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " 
pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.663726 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctv5f\" (UniqueName: \"kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.664143 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.664222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.664665 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.664731 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc 
kubenswrapper[4822]: I1010 06:47:33.680380 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.687859 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctv5f\" (UniqueName: \"kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f\") pod \"redhat-operators-zmvls\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:33 crc kubenswrapper[4822]: I1010 06:47:33.733259 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.185632 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.216829 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nrgt7_b5adf58a-6071-48bd-8e95-2a664f10d551/ovs-vswitchd/0.log" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.217676 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.377885 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378007 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378077 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378110 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log" (OuterVolumeSpecName: "var-log") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378134 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378154 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378161 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run" (OuterVolumeSpecName: "var-run") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378173 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxx94\" (UniqueName: \"kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94\") pod \"b5adf58a-6071-48bd-8e95-2a664f10d551\" (UID: \"b5adf58a-6071-48bd-8e95-2a664f10d551\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378402 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378440 4822 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378452 4822 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-log\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.378573 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib" (OuterVolumeSpecName: "var-lib") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.379667 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts" (OuterVolumeSpecName: "scripts") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.387015 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94" (OuterVolumeSpecName: "kube-api-access-lxx94") pod "b5adf58a-6071-48bd-8e95-2a664f10d551" (UID: "b5adf58a-6071-48bd-8e95-2a664f10d551"). InnerVolumeSpecName "kube-api-access-lxx94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.479312 4822 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.479345 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5adf58a-6071-48bd-8e95-2a664f10d551-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.479354 4822 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5adf58a-6071-48bd-8e95-2a664f10d551-var-lib\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.479364 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxx94\" (UniqueName: \"kubernetes.io/projected/b5adf58a-6071-48bd-8e95-2a664f10d551-kube-api-access-lxx94\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.539789 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 10 06:47:34 crc kubenswrapper[4822]: E1010 06:47:34.580817 4822 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Oct 10 06:47:34 crc kubenswrapper[4822]: E1010 06:47:34.580869 4822 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Oct 10 06:47:34 crc kubenswrapper[4822]: E1010 06:47:34.580883 4822 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 06:47:34 crc kubenswrapper[4822]: E1010 06:47:34.580899 4822 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:34 crc kubenswrapper[4822]: E1010 06:47:34.580978 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift podName:21826954-a4ea-4715-be9e-6cd8272342dc nodeName:}" failed. No retries permitted until 2025-10-10 06:48:06.580952123 +0000 UTC m=+1433.676110329 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift") pod "swift-storage-0" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.632482 4822 generic.go:334] "Generic (PLEG): container finished" podID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerID="0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f" exitCode=0 Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.632597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerDied","Data":"0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.632632 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerStarted","Data":"c84bc24ae05a5e09eb4928267d75b3b145454467ccde50262afc66690c24946d"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.641123 4822 generic.go:334] "Generic (PLEG): container finished" podID="21826954-a4ea-4715-be9e-6cd8272342dc" containerID="c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e" exitCode=137 Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.641200 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.641181 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.641308 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21826954-a4ea-4715-be9e-6cd8272342dc","Type":"ContainerDied","Data":"2aa154c46a6bc1c78d7a6b89025950c4f006431b7f75cd5bd318e74cb583be51"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.641332 4822 scope.go:117] "RemoveContainer" containerID="c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.644270 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nrgt7_b5adf58a-6071-48bd-8e95-2a664f10d551/ovs-vswitchd/0.log" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.645027 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" exitCode=137 Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.645099 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-nrgt7" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.645152 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerDied","Data":"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.645205 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nrgt7" event={"ID":"b5adf58a-6071-48bd-8e95-2a664f10d551","Type":"ContainerDied","Data":"285ad830049d2a7a50d439eb2ade0783f668170a94ba668dbb96ae44b83ddd41"} Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.668481 4822 scope.go:117] "RemoveContainer" containerID="67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.682779 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p85j\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j\") pod \"21826954-a4ea-4715-be9e-6cd8272342dc\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.682940 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache\") pod \"21826954-a4ea-4715-be9e-6cd8272342dc\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.682980 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"21826954-a4ea-4715-be9e-6cd8272342dc\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.683060 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock\") pod \"21826954-a4ea-4715-be9e-6cd8272342dc\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.683140 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") pod \"21826954-a4ea-4715-be9e-6cd8272342dc\" (UID: \"21826954-a4ea-4715-be9e-6cd8272342dc\") " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.686648 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock" (OuterVolumeSpecName: "lock") pod "21826954-a4ea-4715-be9e-6cd8272342dc" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.686694 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache" (OuterVolumeSpecName: "cache") pod "21826954-a4ea-4715-be9e-6cd8272342dc" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.687693 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"] Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.688827 4822 scope.go:117] "RemoveContainer" containerID="13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.692019 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "21826954-a4ea-4715-be9e-6cd8272342dc" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.694351 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-nrgt7"] Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.694438 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j" (OuterVolumeSpecName: "kube-api-access-4p85j") pod "21826954-a4ea-4715-be9e-6cd8272342dc" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc"). InnerVolumeSpecName "kube-api-access-4p85j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.694548 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21826954-a4ea-4715-be9e-6cd8272342dc" (UID: "21826954-a4ea-4715-be9e-6cd8272342dc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.712772 4822 scope.go:117] "RemoveContainer" containerID="11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.730033 4822 scope.go:117] "RemoveContainer" containerID="cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.781954 4822 scope.go:117] "RemoveContainer" containerID="e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.785255 4822 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-lock\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.785280 4822 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.785289 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p85j\" (UniqueName: \"kubernetes.io/projected/21826954-a4ea-4715-be9e-6cd8272342dc-kube-api-access-4p85j\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.785298 4822 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21826954-a4ea-4715-be9e-6cd8272342dc-cache\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.785316 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.800105 4822 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.801921 4822 scope.go:117] "RemoveContainer" containerID="0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.823509 4822 scope.go:117] "RemoveContainer" containerID="b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.844352 4822 scope.go:117] "RemoveContainer" containerID="7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.866924 4822 scope.go:117] "RemoveContainer" containerID="4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.887267 4822 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.890690 4822 scope.go:117] "RemoveContainer" containerID="04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.917789 4822 scope.go:117] "RemoveContainer" containerID="ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.944825 4822 scope.go:117] "RemoveContainer" containerID="19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.979460 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 06:47:34.982937 4822 scope.go:117] "RemoveContainer" containerID="3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c" Oct 10 06:47:34 crc kubenswrapper[4822]: I1010 
06:47:34.984893 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:34.999964 4822 scope.go:117] "RemoveContainer" containerID="e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.031069 4822 scope.go:117] "RemoveContainer" containerID="c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.031514 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e\": container with ID starting with c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e not found: ID does not exist" containerID="c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.031563 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e"} err="failed to get container status \"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e\": rpc error: code = NotFound desc = could not find container \"c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e\": container with ID starting with c5672af3fea9334998677576c1a3f2b18b28814ff4a7c319eb33db0e321c253e not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.031589 4822 scope.go:117] "RemoveContainer" containerID="67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.032128 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822\": container with ID starting with 
67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822 not found: ID does not exist" containerID="67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.032177 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822"} err="failed to get container status \"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822\": rpc error: code = NotFound desc = could not find container \"67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822\": container with ID starting with 67288d8130668d1ffde69a50dac06a21a6a399064d4c92601c0d7fbaf2541822 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.032211 4822 scope.go:117] "RemoveContainer" containerID="13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.032533 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d\": container with ID starting with 13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d not found: ID does not exist" containerID="13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.032581 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d"} err="failed to get container status \"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d\": rpc error: code = NotFound desc = could not find container \"13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d\": container with ID starting with 13b04e51c5e78e5efff6657eb74a1f15c70df090ed277fde1fac61353f5d7d2d not found: ID does not 
exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.032611 4822 scope.go:117] "RemoveContainer" containerID="11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.033047 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9\": container with ID starting with 11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9 not found: ID does not exist" containerID="11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033082 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9"} err="failed to get container status \"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9\": rpc error: code = NotFound desc = could not find container \"11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9\": container with ID starting with 11fe3de564ed8183bea182b722bf135fac17bfe368d354e9f3dd8c651e8996c9 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033100 4822 scope.go:117] "RemoveContainer" containerID="cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.033420 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4\": container with ID starting with cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4 not found: ID does not exist" containerID="cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033450 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4"} err="failed to get container status \"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4\": rpc error: code = NotFound desc = could not find container \"cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4\": container with ID starting with cb9a7181e21c0138c83de02d11547c82c7b5b5f15ebc45a5186e8805e73d97d4 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033470 4822 scope.go:117] "RemoveContainer" containerID="e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.033751 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12\": container with ID starting with e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12 not found: ID does not exist" containerID="e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033876 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12"} err="failed to get container status \"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12\": rpc error: code = NotFound desc = could not find container \"e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12\": container with ID starting with e400ab3fef8849a522debf12e7fe38a03fd5f5fd2d62c207cbc09104a185bf12 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.033910 4822 scope.go:117] "RemoveContainer" containerID="0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.034305 4822 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8\": container with ID starting with 0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8 not found: ID does not exist" containerID="0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.034330 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8"} err="failed to get container status \"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8\": rpc error: code = NotFound desc = could not find container \"0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8\": container with ID starting with 0b6211c3b0822e63be202243486bf42d1bf009dc7a312a64e0fc9326487cfea8 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.034343 4822 scope.go:117] "RemoveContainer" containerID="b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.034596 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd\": container with ID starting with b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd not found: ID does not exist" containerID="b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.034693 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd"} err="failed to get container status \"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd\": rpc error: code = NotFound desc = could 
not find container \"b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd\": container with ID starting with b47c1ed31ade1e1187e245eb0d600917fd292ede0eb65291bcf076d5512e5acd not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.034747 4822 scope.go:117] "RemoveContainer" containerID="7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.035155 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664\": container with ID starting with 7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664 not found: ID does not exist" containerID="7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.035176 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664"} err="failed to get container status \"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664\": rpc error: code = NotFound desc = could not find container \"7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664\": container with ID starting with 7448306a3ee1134295a1c6a2dfb3ab9e60def132bafd619a3668c2be9eda4664 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.035189 4822 scope.go:117] "RemoveContainer" containerID="4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.036074 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e\": container with ID starting with 4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e not found: 
ID does not exist" containerID="4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.036108 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e"} err="failed to get container status \"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e\": rpc error: code = NotFound desc = could not find container \"4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e\": container with ID starting with 4a212be009a065a1d5b41175812032eac23ed4322679ca7178bd60c223bd9d5e not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.036132 4822 scope.go:117] "RemoveContainer" containerID="04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.036482 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66\": container with ID starting with 04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66 not found: ID does not exist" containerID="04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.036525 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66"} err="failed to get container status \"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66\": rpc error: code = NotFound desc = could not find container \"04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66\": container with ID starting with 04d1545848c3370aa656d4c54c5bb6d65e4c880ad7fdb4d3d82b9e403884db66 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.036552 4822 
scope.go:117] "RemoveContainer" containerID="ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.037028 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768\": container with ID starting with ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768 not found: ID does not exist" containerID="ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037048 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768"} err="failed to get container status \"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768\": rpc error: code = NotFound desc = could not find container \"ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768\": container with ID starting with ff03a8201ca4853718d660a98579a6471f478353d2b536c8dcd8169dac443768 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037061 4822 scope.go:117] "RemoveContainer" containerID="19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.037387 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc\": container with ID starting with 19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc not found: ID does not exist" containerID="19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037428 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc"} err="failed to get container status \"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc\": rpc error: code = NotFound desc = could not find container \"19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc\": container with ID starting with 19793e48f3cdc2aab5a339fe7db2a6da9e2131498df7ed33243055e80a8dc7dc not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037466 4822 scope.go:117] "RemoveContainer" containerID="3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.037745 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c\": container with ID starting with 3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c not found: ID does not exist" containerID="3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037766 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c"} err="failed to get container status \"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c\": rpc error: code = NotFound desc = could not find container \"3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c\": container with ID starting with 3fe9ea3ea1d3a176522c1b3d5c595a911e821153c607cd64a497c71b4586612c not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.037778 4822 scope.go:117] "RemoveContainer" containerID="e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.038018 4822 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3\": container with ID starting with e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3 not found: ID does not exist" containerID="e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.038036 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3"} err="failed to get container status \"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3\": rpc error: code = NotFound desc = could not find container \"e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3\": container with ID starting with e290ed26b6eb3c76a53bdcffb5cd8c8da3ee44dbffc71ad149bf2b48d403ddb3 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.038047 4822 scope.go:117] "RemoveContainer" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.056381 4822 scope.go:117] "RemoveContainer" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.075073 4822 scope.go:117] "RemoveContainer" containerID="074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.101055 4822 scope.go:117] "RemoveContainer" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.101549 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb\": container with ID starting with 
84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb not found: ID does not exist" containerID="84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.101579 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb"} err="failed to get container status \"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb\": rpc error: code = NotFound desc = could not find container \"84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb\": container with ID starting with 84b20d957fc4de4c10cbee050a68da2831a1ffd00159322f63ebdc3b394d24bb not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.101598 4822 scope.go:117] "RemoveContainer" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.102009 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81\": container with ID starting with 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 not found: ID does not exist" containerID="3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.102042 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81"} err="failed to get container status \"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81\": rpc error: code = NotFound desc = could not find container \"3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81\": container with ID starting with 3fe1465372403ba455260ae73c963b7d427a36a06a35970b800d6aff713f3e81 not found: ID does not 
exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.102057 4822 scope.go:117] "RemoveContainer" containerID="074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.102318 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992\": container with ID starting with 074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992 not found: ID does not exist" containerID="074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.102343 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992"} err="failed to get container status \"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992\": rpc error: code = NotFound desc = could not find container \"074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992\": container with ID starting with 074d0c394d2a7b41fe17f9cd3eb4685fc4e55a39c2c29b89856ea315abd8b992 not found: ID does not exist" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.663493 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" path="/var/lib/kubelet/pods/21826954-a4ea-4715-be9e-6cd8272342dc/volumes" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.665855 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" path="/var/lib/kubelet/pods/b5adf58a-6071-48bd-8e95-2a664f10d551/volumes" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.666532 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" 
event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerStarted","Data":"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a"} Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.791551 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.791926 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-reaper" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.791941 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-reaper" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.791954 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.791961 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.791971 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.791978 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.791988 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.791995 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-server" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792010 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792017 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792034 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792041 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792051 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792058 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792069 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792076 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-server" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792084 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792089 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792099 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792105 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792113 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792119 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792129 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792134 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792141 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792146 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-server" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792158 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="rsync" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792164 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="rsync" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792175 4822 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792181 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792188 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-expirer" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792194 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-expirer" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792204 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server-init" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792210 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server-init" Oct 10 06:47:35 crc kubenswrapper[4822]: E1010 06:47:35.792222 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="swift-recon-cron" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792228 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="swift-recon-cron" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792400 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="rsync" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792422 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792431 4822 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792441 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792454 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792466 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovsdb-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792477 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792487 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-auditor" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792502 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5adf58a-6071-48bd-8e95-2a664f10d551" containerName="ovs-vswitchd" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792512 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792523 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-expirer" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792532 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792542 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-server" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792554 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="account-reaper" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792562 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="container-replicator" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792575 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="object-updater" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.792585 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="21826954-a4ea-4715-be9e-6cd8272342dc" containerName="swift-recon-cron" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.793832 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.810606 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.900070 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.900158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkpw\" (UniqueName: \"kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.900191 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:35 crc kubenswrapper[4822]: I1010 06:47:35.984651 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"] Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.002368 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlkpw\" (UniqueName: \"kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw\") pod \"certified-operators-tgqvc\" (UID: 
\"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.002420 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.002494 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.003065 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.003110 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.023078 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkpw\" (UniqueName: \"kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw\") pod \"certified-operators-tgqvc\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " 
pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.118132 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.594178 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.673463 4822 generic.go:334] "Generic (PLEG): container finished" podID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerID="b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a" exitCode=0 Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.673567 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerDied","Data":"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a"} Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.677557 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerStarted","Data":"02cae38e4b31eef85238fc8ae7a3cc28ae40f653911b58d0f947f394702b37cd"} Oct 10 06:47:36 crc kubenswrapper[4822]: I1010 06:47:36.677825 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2sw5" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="registry-server" containerID="cri-o://fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905" gracePeriod=2 Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.102005 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.220986 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content\") pod \"ca446713-8b21-4ad1-8068-3460c5dad716\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.221452 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities\") pod \"ca446713-8b21-4ad1-8068-3460c5dad716\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.221536 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvbwc\" (UniqueName: \"kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc\") pod \"ca446713-8b21-4ad1-8068-3460c5dad716\" (UID: \"ca446713-8b21-4ad1-8068-3460c5dad716\") " Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.223021 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities" (OuterVolumeSpecName: "utilities") pod "ca446713-8b21-4ad1-8068-3460c5dad716" (UID: "ca446713-8b21-4ad1-8068-3460c5dad716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.229852 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc" (OuterVolumeSpecName: "kube-api-access-zvbwc") pod "ca446713-8b21-4ad1-8068-3460c5dad716" (UID: "ca446713-8b21-4ad1-8068-3460c5dad716"). InnerVolumeSpecName "kube-api-access-zvbwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.240791 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca446713-8b21-4ad1-8068-3460c5dad716" (UID: "ca446713-8b21-4ad1-8068-3460c5dad716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.322741 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvbwc\" (UniqueName: \"kubernetes.io/projected/ca446713-8b21-4ad1-8068-3460c5dad716-kube-api-access-zvbwc\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.322847 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.322858 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca446713-8b21-4ad1-8068-3460c5dad716-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.687609 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerStarted","Data":"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce"} Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.691333 4822 generic.go:334] "Generic (PLEG): container finished" podID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerID="8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648" exitCode=0 Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.691411 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerDied","Data":"8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648"} Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.696388 4822 generic.go:334] "Generic (PLEG): container finished" podID="ca446713-8b21-4ad1-8068-3460c5dad716" containerID="fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905" exitCode=0 Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.696433 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerDied","Data":"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905"} Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.696486 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2sw5" event={"ID":"ca446713-8b21-4ad1-8068-3460c5dad716","Type":"ContainerDied","Data":"556bf9fcde6adbf72faf16568fbbd1483ce0fffc2934161fa2d0103a9964f758"} Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.696546 4822 scope.go:117] "RemoveContainer" containerID="fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.696838 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2sw5" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.717376 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmvls" podStartSLOduration=1.871897132 podStartE2EDuration="4.717353291s" podCreationTimestamp="2025-10-10 06:47:33 +0000 UTC" firstStartedPulling="2025-10-10 06:47:34.634160005 +0000 UTC m=+1401.729318201" lastFinishedPulling="2025-10-10 06:47:37.479616144 +0000 UTC m=+1404.574774360" observedRunningTime="2025-10-10 06:47:37.711566175 +0000 UTC m=+1404.806724391" watchObservedRunningTime="2025-10-10 06:47:37.717353291 +0000 UTC m=+1404.812511487" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.724639 4822 scope.go:117] "RemoveContainer" containerID="2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.753924 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"] Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.754257 4822 scope.go:117] "RemoveContainer" containerID="a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.759604 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2sw5"] Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.775155 4822 scope.go:117] "RemoveContainer" containerID="fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905" Oct 10 06:47:37 crc kubenswrapper[4822]: E1010 06:47:37.775601 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905\": container with ID starting with fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905 not found: ID does not exist" 
containerID="fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.775649 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905"} err="failed to get container status \"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905\": rpc error: code = NotFound desc = could not find container \"fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905\": container with ID starting with fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905 not found: ID does not exist" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.775675 4822 scope.go:117] "RemoveContainer" containerID="2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e" Oct 10 06:47:37 crc kubenswrapper[4822]: E1010 06:47:37.776494 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e\": container with ID starting with 2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e not found: ID does not exist" containerID="2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.776539 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e"} err="failed to get container status \"2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e\": rpc error: code = NotFound desc = could not find container \"2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e\": container with ID starting with 2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e not found: ID does not exist" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.776597 4822 scope.go:117] 
"RemoveContainer" containerID="a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843" Oct 10 06:47:37 crc kubenswrapper[4822]: E1010 06:47:37.777097 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843\": container with ID starting with a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843 not found: ID does not exist" containerID="a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.777128 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843"} err="failed to get container status \"a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843\": rpc error: code = NotFound desc = could not find container \"a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843\": container with ID starting with a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843 not found: ID does not exist" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.863745 4822 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7d3cbb68-b8c5-44fa-bd93-f70ad796d01a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d3cbb68-b8c5-44fa-bd93-f70ad796d01a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d3cbb68_b8c5_44fa_bd93_f70ad796d01a.slice" Oct 10 06:47:37 crc kubenswrapper[4822]: E1010 06:47:37.863790 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7d3cbb68-b8c5-44fa-bd93-f70ad796d01a] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d3cbb68-b8c5-44fa-bd93-f70ad796d01a] : Timed out while waiting for systemd to remove 
kubepods-besteffort-pod7d3cbb68_b8c5_44fa_bd93_f70ad796d01a.slice" pod="openstack/cinder-scheduler-0" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" Oct 10 06:47:37 crc kubenswrapper[4822]: I1010 06:47:37.865861 4822 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0df354f0-e9e8-441a-a676-8a6468b8c191"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0df354f0-e9e8-441a-a676-8a6468b8c191] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0df354f0_e9e8_441a_a676_8a6468b8c191.slice" Oct 10 06:47:37 crc kubenswrapper[4822]: E1010 06:47:37.865911 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod0df354f0-e9e8-441a-a676-8a6468b8c191] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod0df354f0-e9e8-441a-a676-8a6468b8c191] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0df354f0_e9e8_441a_a676_8a6468b8c191.slice" pod="openstack/swift-proxy-6b5485c95f-w8q56" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.719639 4822 generic.go:334] "Generic (PLEG): container finished" podID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerID="7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f" exitCode=0 Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.719701 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerDied","Data":"7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f"} Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.721369 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.721378 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6b5485c95f-w8q56" Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.781740 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.788672 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6b5485c95f-w8q56"] Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.803987 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:47:38 crc kubenswrapper[4822]: I1010 06:47:38.814317 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.250074 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron834d-account-delete-j5ft4" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.380708 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnwpt\" (UniqueName: \"kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt\") pod \"52264dc7-4118-484f-ab12-1bfd17172c20\" (UID: \"52264dc7-4118-484f-ab12-1bfd17172c20\") " Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.387145 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt" (OuterVolumeSpecName: "kube-api-access-tnwpt") pod "52264dc7-4118-484f-ab12-1bfd17172c20" (UID: "52264dc7-4118-484f-ab12-1bfd17172c20"). InnerVolumeSpecName "kube-api-access-tnwpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.415845 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder5f70-account-delete-kxtp9" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.421650 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07142-account-delete-jddpj" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.482311 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnwpt\" (UniqueName: \"kubernetes.io/projected/52264dc7-4118-484f-ab12-1bfd17172c20-kube-api-access-tnwpt\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.583219 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69zwd\" (UniqueName: \"kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd\") pod \"dc72727a-70e5-402e-90f1-2c54c48dd5f8\" (UID: \"dc72727a-70e5-402e-90f1-2c54c48dd5f8\") " Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.583359 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwbsf\" (UniqueName: \"kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf\") pod \"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86\" (UID: \"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86\") " Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.586260 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd" (OuterVolumeSpecName: "kube-api-access-69zwd") pod "dc72727a-70e5-402e-90f1-2c54c48dd5f8" (UID: "dc72727a-70e5-402e-90f1-2c54c48dd5f8"). InnerVolumeSpecName "kube-api-access-69zwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.587514 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf" (OuterVolumeSpecName: "kube-api-access-fwbsf") pod "caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" (UID: "caeaf30d-9a5b-423c-bf9a-eaaa9351ec86"). InnerVolumeSpecName "kube-api-access-fwbsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.659442 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df354f0-e9e8-441a-a676-8a6468b8c191" path="/var/lib/kubelet/pods/0df354f0-e9e8-441a-a676-8a6468b8c191/volumes" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.660051 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3cbb68-b8c5-44fa-bd93-f70ad796d01a" path="/var/lib/kubelet/pods/7d3cbb68-b8c5-44fa-bd93-f70ad796d01a/volumes" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.660759 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" path="/var/lib/kubelet/pods/ca446713-8b21-4ad1-8068-3460c5dad716/volumes" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.685955 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwbsf\" (UniqueName: \"kubernetes.io/projected/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86-kube-api-access-fwbsf\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.685989 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69zwd\" (UniqueName: \"kubernetes.io/projected/dc72727a-70e5-402e-90f1-2c54c48dd5f8-kube-api-access-69zwd\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.730552 4822 generic.go:334] "Generic (PLEG): container finished" podID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" 
containerID="5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b" exitCode=137 Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.730647 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07142-account-delete-jddpj" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.730646 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07142-account-delete-jddpj" event={"ID":"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86","Type":"ContainerDied","Data":"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.730703 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07142-account-delete-jddpj" event={"ID":"caeaf30d-9a5b-423c-bf9a-eaaa9351ec86","Type":"ContainerDied","Data":"b9116d6ea01101ec61e3dfe7b9540f5bc7b5c158af7c6f45265e7380b9e6c2ec"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.730743 4822 scope.go:117] "RemoveContainer" containerID="5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.732643 4822 generic.go:334] "Generic (PLEG): container finished" podID="52264dc7-4118-484f-ab12-1bfd17172c20" containerID="877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37" exitCode=137 Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.732697 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron834d-account-delete-j5ft4" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.732720 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron834d-account-delete-j5ft4" event={"ID":"52264dc7-4118-484f-ab12-1bfd17172c20","Type":"ContainerDied","Data":"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.732758 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron834d-account-delete-j5ft4" event={"ID":"52264dc7-4118-484f-ab12-1bfd17172c20","Type":"ContainerDied","Data":"50d159aac72227ee1187b2a0cd8cff14aa3adefd530efa833248f31eee9c9e76"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.736589 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerStarted","Data":"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.738313 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" containerID="cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa" exitCode=137 Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.738361 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder5f70-account-delete-kxtp9" event={"ID":"dc72727a-70e5-402e-90f1-2c54c48dd5f8","Type":"ContainerDied","Data":"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.738374 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder5f70-account-delete-kxtp9" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.738391 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder5f70-account-delete-kxtp9" event={"ID":"dc72727a-70e5-402e-90f1-2c54c48dd5f8","Type":"ContainerDied","Data":"d728381013137c6b810bd10f751a79667e9e1b3ec2cdbaf8b5ee408b4ec28624"} Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.753761 4822 scope.go:117] "RemoveContainer" containerID="5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b" Oct 10 06:47:39 crc kubenswrapper[4822]: E1010 06:47:39.754485 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b\": container with ID starting with 5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b not found: ID does not exist" containerID="5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.754538 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b"} err="failed to get container status \"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b\": rpc error: code = NotFound desc = could not find container \"5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b\": container with ID starting with 5cdc444179bf7bbf088623c7e58bf94e9395cce555774da0a437689d26867c6b not found: ID does not exist" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.754571 4822 scope.go:117] "RemoveContainer" containerID="877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.760070 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell07142-account-delete-jddpj"] Oct 10 06:47:39 crc 
kubenswrapper[4822]: I1010 06:47:39.772790 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell07142-account-delete-jddpj"] Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.774284 4822 scope.go:117] "RemoveContainer" containerID="877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37" Oct 10 06:47:39 crc kubenswrapper[4822]: E1010 06:47:39.774769 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37\": container with ID starting with 877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37 not found: ID does not exist" containerID="877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.774823 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37"} err="failed to get container status \"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37\": rpc error: code = NotFound desc = could not find container \"877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37\": container with ID starting with 877b85517c88b0c41292de6c6a8e949adbeea87e12068ea5a0a542249a61bb37 not found: ID does not exist" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.774852 4822 scope.go:117] "RemoveContainer" containerID="cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.782414 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"] Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.789582 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron834d-account-delete-j5ft4"] Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.795253 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tgqvc" podStartSLOduration=3.33658158 podStartE2EDuration="4.79523215s" podCreationTimestamp="2025-10-10 06:47:35 +0000 UTC" firstStartedPulling="2025-10-10 06:47:37.693638588 +0000 UTC m=+1404.788796784" lastFinishedPulling="2025-10-10 06:47:39.152289158 +0000 UTC m=+1406.247447354" observedRunningTime="2025-10-10 06:47:39.791335678 +0000 UTC m=+1406.886493884" watchObservedRunningTime="2025-10-10 06:47:39.79523215 +0000 UTC m=+1406.890390346" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.795918 4822 scope.go:117] "RemoveContainer" containerID="cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa" Oct 10 06:47:39 crc kubenswrapper[4822]: E1010 06:47:39.796373 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa\": container with ID starting with cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa not found: ID does not exist" containerID="cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.797063 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa"} err="failed to get container status \"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa\": rpc error: code = NotFound desc = could not find container \"cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa\": container with ID starting with cb768949eb49e8fd61e3d04c1d980a922ddb4d715b2594609554ae497a4f0ffa not found: ID does not exist" Oct 10 06:47:39 crc kubenswrapper[4822]: I1010 06:47:39.809242 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:39 crc 
kubenswrapper[4822]: I1010 06:47:39.814054 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder5f70-account-delete-kxtp9"] Oct 10 06:47:40 crc kubenswrapper[4822]: W1010 06:47:40.205580 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca446713_8b21_4ad1_8068_3460c5dad716.slice/crio-a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843.scope WatchSource:0}: Error finding container a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843: Status 404 returned error can't find the container with id a2ef6d4b718db6e1619ffc05eda37fece6c2d1c1fa5a4e453ae66b5a0ac5a843 Oct 10 06:47:40 crc kubenswrapper[4822]: W1010 06:47:40.206627 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca446713_8b21_4ad1_8068_3460c5dad716.slice/crio-2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e.scope WatchSource:0}: Error finding container 2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e: Status 404 returned error can't find the container with id 2c26f7dbb4e3e9ed4c0ccd31ca27f64417d97cefbe9e6055c3c29bc060c2614e Oct 10 06:47:40 crc kubenswrapper[4822]: W1010 06:47:40.208542 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca446713_8b21_4ad1_8068_3460c5dad716.slice/crio-fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905.scope WatchSource:0}: Error finding container fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905: Status 404 returned error can't find the container with id fc943f8a775d706460f9e3e86d9a5bf10991cc32565f2686f39b84951df1a905 Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.532111 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican9441-account-delete-zxr9g" Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.700836 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjqz\" (UniqueName: \"kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz\") pod \"c07072cb-ae19-4dcb-9f52-432fe923949d\" (UID: \"c07072cb-ae19-4dcb-9f52-432fe923949d\") " Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.713037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz" (OuterVolumeSpecName: "kube-api-access-dtjqz") pod "c07072cb-ae19-4dcb-9f52-432fe923949d" (UID: "c07072cb-ae19-4dcb-9f52-432fe923949d"). InnerVolumeSpecName "kube-api-access-dtjqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.747016 4822 generic.go:334] "Generic (PLEG): container finished" podID="c07072cb-ae19-4dcb-9f52-432fe923949d" containerID="95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75" exitCode=137 Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.747069 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9441-account-delete-zxr9g" event={"ID":"c07072cb-ae19-4dcb-9f52-432fe923949d","Type":"ContainerDied","Data":"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75"} Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.747096 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9441-account-delete-zxr9g" event={"ID":"c07072cb-ae19-4dcb-9f52-432fe923949d","Type":"ContainerDied","Data":"53dbd27625668b4d44abfc598493fb4a5c6ececdfbfff7c832320fb49ece7bf8"} Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.747112 4822 scope.go:117] "RemoveContainer" containerID="95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75" Oct 10 06:47:40 crc 
kubenswrapper[4822]: I1010 06:47:40.747201 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9441-account-delete-zxr9g" Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.769920 4822 scope.go:117] "RemoveContainer" containerID="95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75" Oct 10 06:47:40 crc kubenswrapper[4822]: E1010 06:47:40.770341 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75\": container with ID starting with 95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75 not found: ID does not exist" containerID="95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75" Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.770434 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75"} err="failed to get container status \"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75\": rpc error: code = NotFound desc = could not find container \"95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75\": container with ID starting with 95946dad52335500410f26c22364c959164e21c7231be3affda81394fcd77d75 not found: ID does not exist" Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.777904 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.783603 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican9441-account-delete-zxr9g"] Oct 10 06:47:40 crc kubenswrapper[4822]: I1010 06:47:40.803194 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjqz\" (UniqueName: \"kubernetes.io/projected/c07072cb-ae19-4dcb-9f52-432fe923949d-kube-api-access-dtjqz\") 
on node \"crc\" DevicePath \"\"" Oct 10 06:47:41 crc kubenswrapper[4822]: I1010 06:47:41.668186 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52264dc7-4118-484f-ab12-1bfd17172c20" path="/var/lib/kubelet/pods/52264dc7-4118-484f-ab12-1bfd17172c20/volumes" Oct 10 06:47:41 crc kubenswrapper[4822]: I1010 06:47:41.669274 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07072cb-ae19-4dcb-9f52-432fe923949d" path="/var/lib/kubelet/pods/c07072cb-ae19-4dcb-9f52-432fe923949d/volumes" Oct 10 06:47:41 crc kubenswrapper[4822]: I1010 06:47:41.670315 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" path="/var/lib/kubelet/pods/caeaf30d-9a5b-423c-bf9a-eaaa9351ec86/volumes" Oct 10 06:47:41 crc kubenswrapper[4822]: I1010 06:47:41.671286 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" path="/var/lib/kubelet/pods/dc72727a-70e5-402e-90f1-2c54c48dd5f8/volumes" Oct 10 06:47:43 crc kubenswrapper[4822]: I1010 06:47:43.734736 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:43 crc kubenswrapper[4822]: I1010 06:47:43.735100 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:43 crc kubenswrapper[4822]: I1010 06:47:43.777464 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:43 crc kubenswrapper[4822]: I1010 06:47:43.824524 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:44 crc kubenswrapper[4822]: I1010 06:47:44.186264 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:45 crc kubenswrapper[4822]: I1010 
06:47:45.802916 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmvls" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="registry-server" containerID="cri-o://eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce" gracePeriod=2 Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.118578 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.118654 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.162233 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.384004 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.486414 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctv5f\" (UniqueName: \"kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f\") pod \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.486832 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content\") pod \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.486909 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities\") pod \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\" (UID: \"e9981e04-0357-4ea3-a845-0a8cc3eb476f\") " Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.489357 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities" (OuterVolumeSpecName: "utilities") pod "e9981e04-0357-4ea3-a845-0a8cc3eb476f" (UID: "e9981e04-0357-4ea3-a845-0a8cc3eb476f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.494998 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f" (OuterVolumeSpecName: "kube-api-access-ctv5f") pod "e9981e04-0357-4ea3-a845-0a8cc3eb476f" (UID: "e9981e04-0357-4ea3-a845-0a8cc3eb476f"). InnerVolumeSpecName "kube-api-access-ctv5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.588725 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctv5f\" (UniqueName: \"kubernetes.io/projected/e9981e04-0357-4ea3-a845-0a8cc3eb476f-kube-api-access-ctv5f\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.588754 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.818141 4822 generic.go:334] "Generic (PLEG): container finished" podID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerID="eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce" exitCode=0 Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.819098 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvls" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.819216 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerDied","Data":"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce"} Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.819330 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvls" event={"ID":"e9981e04-0357-4ea3-a845-0a8cc3eb476f","Type":"ContainerDied","Data":"c84bc24ae05a5e09eb4928267d75b3b145454467ccde50262afc66690c24946d"} Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.819428 4822 scope.go:117] "RemoveContainer" containerID="eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.846094 4822 scope.go:117] "RemoveContainer" 
containerID="b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.870051 4822 scope.go:117] "RemoveContainer" containerID="0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.870150 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.897004 4822 scope.go:117] "RemoveContainer" containerID="eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce" Oct 10 06:47:46 crc kubenswrapper[4822]: E1010 06:47:46.897641 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce\": container with ID starting with eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce not found: ID does not exist" containerID="eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.898054 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce"} err="failed to get container status \"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce\": rpc error: code = NotFound desc = could not find container \"eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce\": container with ID starting with eb5aefc1800a93d8d8b7b78d493e4f2fce9f4c709060183a1be8e13f22fca3ce not found: ID does not exist" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.898087 4822 scope.go:117] "RemoveContainer" containerID="b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a" Oct 10 06:47:46 crc kubenswrapper[4822]: E1010 06:47:46.906664 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a\": container with ID starting with b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a not found: ID does not exist" containerID="b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.906743 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a"} err="failed to get container status \"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a\": rpc error: code = NotFound desc = could not find container \"b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a\": container with ID starting with b8350f6028cd61a94953e8f38fb0bbff4356971bd8a38c2b56c75360d4330b7a not found: ID does not exist" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.906796 4822 scope.go:117] "RemoveContainer" containerID="0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f" Oct 10 06:47:46 crc kubenswrapper[4822]: E1010 06:47:46.907293 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f\": container with ID starting with 0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f not found: ID does not exist" containerID="0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f" Oct 10 06:47:46 crc kubenswrapper[4822]: I1010 06:47:46.907317 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f"} err="failed to get container status \"0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f\": rpc error: code = NotFound desc = could not find container 
\"0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f\": container with ID starting with 0a3206776587d3fb85aaf24de9c96b06e6010b4bb3b038351848ed53bf7f866f not found: ID does not exist" Oct 10 06:47:47 crc kubenswrapper[4822]: I1010 06:47:47.031333 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9981e04-0357-4ea3-a845-0a8cc3eb476f" (UID: "e9981e04-0357-4ea3-a845-0a8cc3eb476f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:47 crc kubenswrapper[4822]: I1010 06:47:47.101531 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9981e04-0357-4ea3-a845-0a8cc3eb476f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:47 crc kubenswrapper[4822]: I1010 06:47:47.155765 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:47 crc kubenswrapper[4822]: I1010 06:47:47.163838 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmvls"] Oct 10 06:47:47 crc kubenswrapper[4822]: I1010 06:47:47.664947 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" path="/var/lib/kubelet/pods/e9981e04-0357-4ea3-a845-0a8cc3eb476f/volumes" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.184501 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.185010 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tgqvc" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="registry-server" 
containerID="cri-o://6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1" gracePeriod=2 Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.561187 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.735866 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlkpw\" (UniqueName: \"kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw\") pod \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.736725 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities\") pod \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.736883 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content\") pod \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\" (UID: \"b2fb9277-764a-45d8-b2ec-f662b12ecc33\") " Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.737839 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities" (OuterVolumeSpecName: "utilities") pod "b2fb9277-764a-45d8-b2ec-f662b12ecc33" (UID: "b2fb9277-764a-45d8-b2ec-f662b12ecc33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.743738 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw" (OuterVolumeSpecName: "kube-api-access-xlkpw") pod "b2fb9277-764a-45d8-b2ec-f662b12ecc33" (UID: "b2fb9277-764a-45d8-b2ec-f662b12ecc33"). InnerVolumeSpecName "kube-api-access-xlkpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.794620 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2fb9277-764a-45d8-b2ec-f662b12ecc33" (UID: "b2fb9277-764a-45d8-b2ec-f662b12ecc33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.838504 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlkpw\" (UniqueName: \"kubernetes.io/projected/b2fb9277-764a-45d8-b2ec-f662b12ecc33-kube-api-access-xlkpw\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.838573 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.838596 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fb9277-764a-45d8-b2ec-f662b12ecc33-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.860919 4822 generic.go:334] "Generic (PLEG): container finished" podID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" 
containerID="6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1" exitCode=0 Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.861011 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgqvc" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.861029 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerDied","Data":"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1"} Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.861083 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgqvc" event={"ID":"b2fb9277-764a-45d8-b2ec-f662b12ecc33","Type":"ContainerDied","Data":"02cae38e4b31eef85238fc8ae7a3cc28ae40f653911b58d0f947f394702b37cd"} Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.861122 4822 scope.go:117] "RemoveContainer" containerID="6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.893251 4822 scope.go:117] "RemoveContainer" containerID="7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.908451 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.913715 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tgqvc"] Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.929780 4822 scope.go:117] "RemoveContainer" containerID="8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.964303 4822 scope.go:117] "RemoveContainer" containerID="6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1" Oct 10 
06:47:49 crc kubenswrapper[4822]: E1010 06:47:49.965101 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1\": container with ID starting with 6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1 not found: ID does not exist" containerID="6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.965155 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1"} err="failed to get container status \"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1\": rpc error: code = NotFound desc = could not find container \"6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1\": container with ID starting with 6ec95831c4ca7c941e4981fb7a06c962b42e67f7b95eeb899e026aafded0f5e1 not found: ID does not exist" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.965186 4822 scope.go:117] "RemoveContainer" containerID="7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f" Oct 10 06:47:49 crc kubenswrapper[4822]: E1010 06:47:49.965762 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f\": container with ID starting with 7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f not found: ID does not exist" containerID="7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.965794 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f"} err="failed to get container status 
\"7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f\": rpc error: code = NotFound desc = could not find container \"7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f\": container with ID starting with 7e88992638cfaa888c9fa2ac1cf55afc1520bcc4336103d3c3fc7d31569f678f not found: ID does not exist" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.965834 4822 scope.go:117] "RemoveContainer" containerID="8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648" Oct 10 06:47:49 crc kubenswrapper[4822]: E1010 06:47:49.966329 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648\": container with ID starting with 8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648 not found: ID does not exist" containerID="8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648" Oct 10 06:47:49 crc kubenswrapper[4822]: I1010 06:47:49.966418 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648"} err="failed to get container status \"8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648\": rpc error: code = NotFound desc = could not find container \"8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648\": container with ID starting with 8c8a93439b1ffe80ffdfceeebe61451156a201aa55d360eb636efc8baa9ff648 not found: ID does not exist" Oct 10 06:47:51 crc kubenswrapper[4822]: I1010 06:47:51.668248 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" path="/var/lib/kubelet/pods/b2fb9277-764a-45d8-b2ec-f662b12ecc33/volumes" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.141185 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:01 
crc kubenswrapper[4822]: E1010 06:48:01.142158 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142175 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142196 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="extract-content" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142204 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="extract-content" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142220 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142229 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142245 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142253 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142266 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="extract-content" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142273 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="extract-content" Oct 10 
06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142287 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142295 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142314 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="extract-content" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142322 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="extract-content" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142338 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142346 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142360 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142369 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142379 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07072cb-ae19-4dcb-9f52-432fe923949d" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142386 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07072cb-ae19-4dcb-9f52-432fe923949d" 
containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142401 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142409 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="extract-utilities" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142418 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52264dc7-4118-484f-ab12-1bfd17172c20" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142428 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="52264dc7-4118-484f-ab12-1bfd17172c20" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: E1010 06:48:01.142444 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142453 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142646 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2fb9277-764a-45d8-b2ec-f662b12ecc33" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142670 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca446713-8b21-4ad1-8068-3460c5dad716" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142681 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9981e04-0357-4ea3-a845-0a8cc3eb476f" containerName="registry-server" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142695 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc72727a-70e5-402e-90f1-2c54c48dd5f8" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142711 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="52264dc7-4118-484f-ab12-1bfd17172c20" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142727 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="caeaf30d-9a5b-423c-bf9a-eaaa9351ec86" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.142740 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07072cb-ae19-4dcb-9f52-432fe923949d" containerName="mariadb-account-delete" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.144971 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.151612 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.307320 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.307480 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74l9\" (UniqueName: \"kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.307958 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.408795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.408946 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74l9\" (UniqueName: \"kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.408987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.409440 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.409477 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.435905 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74l9\" (UniqueName: \"kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9\") pod \"community-operators-w7tsj\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.466017 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.979006 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:01 crc kubenswrapper[4822]: I1010 06:48:01.993132 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerStarted","Data":"8dc57aa0381c67d9a7708d257cb69a7e6c4948d6066e12bba6c3dd6a0dbd1815"} Oct 10 06:48:03 crc kubenswrapper[4822]: I1010 06:48:03.008560 4822 generic.go:334] "Generic (PLEG): container finished" podID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerID="b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982" exitCode=0 Oct 10 06:48:03 crc kubenswrapper[4822]: I1010 06:48:03.008783 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerDied","Data":"b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982"} Oct 10 06:48:04 crc 
kubenswrapper[4822]: I1010 06:48:04.018050 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerStarted","Data":"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a"} Oct 10 06:48:05 crc kubenswrapper[4822]: I1010 06:48:05.029534 4822 generic.go:334] "Generic (PLEG): container finished" podID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerID="c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a" exitCode=0 Oct 10 06:48:05 crc kubenswrapper[4822]: I1010 06:48:05.029583 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerDied","Data":"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a"} Oct 10 06:48:06 crc kubenswrapper[4822]: I1010 06:48:06.043223 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerStarted","Data":"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e"} Oct 10 06:48:06 crc kubenswrapper[4822]: I1010 06:48:06.067890 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w7tsj" podStartSLOduration=2.613575823 podStartE2EDuration="5.067871205s" podCreationTimestamp="2025-10-10 06:48:01 +0000 UTC" firstStartedPulling="2025-10-10 06:48:03.01071926 +0000 UTC m=+1430.105877476" lastFinishedPulling="2025-10-10 06:48:05.465014672 +0000 UTC m=+1432.560172858" observedRunningTime="2025-10-10 06:48:06.06003471 +0000 UTC m=+1433.155192936" watchObservedRunningTime="2025-10-10 06:48:06.067871205 +0000 UTC m=+1433.163029401" Oct 10 06:48:11 crc kubenswrapper[4822]: I1010 06:48:11.467876 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:11 crc kubenswrapper[4822]: I1010 06:48:11.470951 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:11 crc kubenswrapper[4822]: I1010 06:48:11.545736 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:12 crc kubenswrapper[4822]: I1010 06:48:12.173218 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:12 crc kubenswrapper[4822]: I1010 06:48:12.224590 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.117536 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w7tsj" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="registry-server" containerID="cri-o://88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e" gracePeriod=2 Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.586362 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.716629 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content\") pod \"31fa3618-b06d-4258-bf82-82d1871c0a31\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.716682 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74l9\" (UniqueName: \"kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9\") pod \"31fa3618-b06d-4258-bf82-82d1871c0a31\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.716785 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities\") pod \"31fa3618-b06d-4258-bf82-82d1871c0a31\" (UID: \"31fa3618-b06d-4258-bf82-82d1871c0a31\") " Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.717860 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities" (OuterVolumeSpecName: "utilities") pod "31fa3618-b06d-4258-bf82-82d1871c0a31" (UID: "31fa3618-b06d-4258-bf82-82d1871c0a31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.721430 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9" (OuterVolumeSpecName: "kube-api-access-z74l9") pod "31fa3618-b06d-4258-bf82-82d1871c0a31" (UID: "31fa3618-b06d-4258-bf82-82d1871c0a31"). InnerVolumeSpecName "kube-api-access-z74l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.769008 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa3618-b06d-4258-bf82-82d1871c0a31" (UID: "31fa3618-b06d-4258-bf82-82d1871c0a31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.818950 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.819239 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa3618-b06d-4258-bf82-82d1871c0a31-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:48:14 crc kubenswrapper[4822]: I1010 06:48:14.819313 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74l9\" (UniqueName: \"kubernetes.io/projected/31fa3618-b06d-4258-bf82-82d1871c0a31-kube-api-access-z74l9\") on node \"crc\" DevicePath \"\"" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.129449 4822 generic.go:334] "Generic (PLEG): container finished" podID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerID="88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e" exitCode=0 Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.129489 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerDied","Data":"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e"} Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.130369 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-w7tsj" event={"ID":"31fa3618-b06d-4258-bf82-82d1871c0a31","Type":"ContainerDied","Data":"8dc57aa0381c67d9a7708d257cb69a7e6c4948d6066e12bba6c3dd6a0dbd1815"} Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.130430 4822 scope.go:117] "RemoveContainer" containerID="88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.129508 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7tsj" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.174271 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.179710 4822 scope.go:117] "RemoveContainer" containerID="c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.181136 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w7tsj"] Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.203142 4822 scope.go:117] "RemoveContainer" containerID="b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.237450 4822 scope.go:117] "RemoveContainer" containerID="88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e" Oct 10 06:48:15 crc kubenswrapper[4822]: E1010 06:48:15.237983 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e\": container with ID starting with 88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e not found: ID does not exist" containerID="88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 
06:48:15.238022 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e"} err="failed to get container status \"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e\": rpc error: code = NotFound desc = could not find container \"88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e\": container with ID starting with 88bf31894f54646cfc220e3f7e7675e794f52f4997c321c0513dd804de06f93e not found: ID does not exist" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.238048 4822 scope.go:117] "RemoveContainer" containerID="c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a" Oct 10 06:48:15 crc kubenswrapper[4822]: E1010 06:48:15.238336 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a\": container with ID starting with c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a not found: ID does not exist" containerID="c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.238429 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a"} err="failed to get container status \"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a\": rpc error: code = NotFound desc = could not find container \"c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a\": container with ID starting with c73492f516f6e7a95410d70b94fa30dbfe6cdea77b2c4d7637dc803baf2c916a not found: ID does not exist" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.238503 4822 scope.go:117] "RemoveContainer" containerID="b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982" Oct 10 06:48:15 crc 
kubenswrapper[4822]: E1010 06:48:15.238887 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982\": container with ID starting with b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982 not found: ID does not exist" containerID="b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.238930 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982"} err="failed to get container status \"b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982\": rpc error: code = NotFound desc = could not find container \"b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982\": container with ID starting with b0d867c894915fcf3f7c3bbf6db23bc8fe62d3898a579bdf1724f7f4333f2982 not found: ID does not exist" Oct 10 06:48:15 crc kubenswrapper[4822]: I1010 06:48:15.676611 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" path="/var/lib/kubelet/pods/31fa3618-b06d-4258-bf82-82d1871c0a31/volumes" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.115580 4822 scope.go:117] "RemoveContainer" containerID="d7ad001d0d4b160f08d794d8ff2e892d3b85e7807420a3abded31fb954c6d8ca" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.151337 4822 scope.go:117] "RemoveContainer" containerID="cd98bfb27b3e9b37541c6b1c1099ebf97943f6cf600c4013ec6157d2daade251" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.180090 4822 scope.go:117] "RemoveContainer" containerID="52abefdc1abbbdf1d6e084d55a389e7f29cd378d32aa689238a2bdc490e4f2a0" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.213153 4822 scope.go:117] "RemoveContainer" 
containerID="eea57be423b28bbdd8158cba171eb7c5cb45f3b6969aa019c1c3bb74047f15a3" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.236730 4822 scope.go:117] "RemoveContainer" containerID="5ba2ef9118bef44e8786bbc493f68dcdd5f9957d059443a5b0396944fd668c2a" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.257385 4822 scope.go:117] "RemoveContainer" containerID="5233ce6a762e0754c515f119b497b8ddc51c7df347fc9453f4cfb86c35c752f7" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.285652 4822 scope.go:117] "RemoveContainer" containerID="ff8720c0716057614e5e35ff1ba277e8d17564bb9d481e502d8d5c8094332b53" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.310162 4822 scope.go:117] "RemoveContainer" containerID="cea6be947bf52e2474312230db60ef4f19b18c57064bc9913aa1efa9a6406d53" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.345069 4822 scope.go:117] "RemoveContainer" containerID="d368a9283a474f279f3a9e7eb440163641f3b6b196b241c7c7adf0aa687133b9" Oct 10 06:48:19 crc kubenswrapper[4822]: I1010 06:48:19.370148 4822 scope.go:117] "RemoveContainer" containerID="d43cccdc5c656da0d889ca4eaba683a10c7359c580ef7744e83ca9ceb5fcb33a" Oct 10 06:49:01 crc kubenswrapper[4822]: I1010 06:49:01.337257 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:49:01 crc kubenswrapper[4822]: I1010 06:49:01.338120 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.676764 4822 scope.go:117] "RemoveContainer" 
containerID="dd2a30b180ec632a6893d74f2fdd44fa6c7795beb5d5790d7be79ba27adf7c63" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.712587 4822 scope.go:117] "RemoveContainer" containerID="81f118e491ac50d56dc726f1e8d71d2fc29cd767138066b7b8edf50a7d2f3ad0" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.738128 4822 scope.go:117] "RemoveContainer" containerID="0141cd260ea12894ec8962858e0a4a76f6db81c7131a7c678f8f6be121b43c25" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.763509 4822 scope.go:117] "RemoveContainer" containerID="7e7cf45d4577b3ec598e972f0f930d14f2f5e126f34531bbd3ee99e6c96bb7e2" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.785427 4822 scope.go:117] "RemoveContainer" containerID="505f7482009dbc32caf5f0d8b1c400586173489bde97d410893531db26d3a1f3" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.825765 4822 scope.go:117] "RemoveContainer" containerID="ebfd8d72e608053f57e097ccab816b8741ef0bdde0a682093445b6f0a526d64a" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.849195 4822 scope.go:117] "RemoveContainer" containerID="dcb45fed241501e193ccf3f61f716249836d1f162c81bc32a9c507e3c8334169" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.865051 4822 scope.go:117] "RemoveContainer" containerID="c07fb29fe58638b86ce3b23d65282f35d480c438268a6fc34c0920e7d3904766" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.899536 4822 scope.go:117] "RemoveContainer" containerID="df44840d8672a385cb99037b5349829505288dc4ab0c0ddee1e133f2fee5d4e1" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.916282 4822 scope.go:117] "RemoveContainer" containerID="00b110eb03f7b216ee678097c29a4872da8d3e2c4317e8feb9cf5c6c2b771324" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.934176 4822 scope.go:117] "RemoveContainer" containerID="fd01c0ab2957061061b8b59ccbdfb85e03a9a4c49b19f380974e1975346c0050" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.952381 4822 scope.go:117] "RemoveContainer" 
containerID="43e47a1651cc659ca237db97e80e625b70db9b6622c2585b63079c7df3b404bd" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.974044 4822 scope.go:117] "RemoveContainer" containerID="1ef82f2ec478186ad8805b22fb1086bf5ed62287bffc865e7603d99c71db4881" Oct 10 06:49:19 crc kubenswrapper[4822]: I1010 06:49:19.996972 4822 scope.go:117] "RemoveContainer" containerID="a829a2721fe99b524e5ca7cb2318d1332bb2968f94f80cd142fd5a74891aa843" Oct 10 06:49:20 crc kubenswrapper[4822]: I1010 06:49:20.020444 4822 scope.go:117] "RemoveContainer" containerID="5d027935eca48c7d38a1daa3d0aabc0470acb3d9c745f8b2626dd36d25168207" Oct 10 06:49:20 crc kubenswrapper[4822]: I1010 06:49:20.056414 4822 scope.go:117] "RemoveContainer" containerID="024bb9bd5ec88357a6f36b69c390ab51145b978575520dba76b734934c4cb667" Oct 10 06:49:31 crc kubenswrapper[4822]: I1010 06:49:31.336615 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:49:31 crc kubenswrapper[4822]: I1010 06:49:31.337457 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:50:01 crc kubenswrapper[4822]: I1010 06:50:01.336464 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:50:01 crc kubenswrapper[4822]: I1010 06:50:01.337633 4822 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:50:01 crc kubenswrapper[4822]: I1010 06:50:01.337735 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:50:01 crc kubenswrapper[4822]: I1010 06:50:01.339041 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:50:01 crc kubenswrapper[4822]: I1010 06:50:01.339161 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" gracePeriod=600 Oct 10 06:50:01 crc kubenswrapper[4822]: E1010 06:50:01.465096 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:50:02 crc kubenswrapper[4822]: I1010 06:50:02.160491 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" 
containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" exitCode=0 Oct 10 06:50:02 crc kubenswrapper[4822]: I1010 06:50:02.160525 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b"} Oct 10 06:50:02 crc kubenswrapper[4822]: I1010 06:50:02.160588 4822 scope.go:117] "RemoveContainer" containerID="006f8177d378a019f93f20e514e90a0268748c4dd87f7ee989c03c088b0112a8" Oct 10 06:50:02 crc kubenswrapper[4822]: I1010 06:50:02.161201 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:50:02 crc kubenswrapper[4822]: E1010 06:50:02.161476 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:50:16 crc kubenswrapper[4822]: I1010 06:50:16.651519 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:50:16 crc kubenswrapper[4822]: E1010 06:50:16.652730 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 
06:50:20.322791 4822 scope.go:117] "RemoveContainer" containerID="9e8a166db7e6684c2f70810fcd5fd1e2bda563d063aea0131ffcf24561a3576f" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.351997 4822 scope.go:117] "RemoveContainer" containerID="f83cde6f4a70de9b355f7b282554b988566370270eca0205f68d4daaaf187345" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.370006 4822 scope.go:117] "RemoveContainer" containerID="9928b4bb5c7c846b4d03b70516ed5ca072768b4d3aa864a47b1164b201b3d55f" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.389603 4822 scope.go:117] "RemoveContainer" containerID="5768417b026ddc15a2a8c2d6da91a0f1f8ec8b7c89708cc85cf21fe86db42db9" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.412763 4822 scope.go:117] "RemoveContainer" containerID="3e7d276d1d3b3fb474bd5e9b35156a0cb9b7aeb19b8240310ff38e0f5a17fcf9" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.430332 4822 scope.go:117] "RemoveContainer" containerID="f839fda56f2538723eb3a69164f746c8534c1810606224e142e99f555c972316" Oct 10 06:50:20 crc kubenswrapper[4822]: I1010 06:50:20.453464 4822 scope.go:117] "RemoveContainer" containerID="83166d2b466bbc334a7a76449841bb9af7bb6da59d0f938b0df0671307d8888b" Oct 10 06:50:30 crc kubenswrapper[4822]: I1010 06:50:30.650253 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:50:30 crc kubenswrapper[4822]: E1010 06:50:30.651008 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:50:45 crc kubenswrapper[4822]: I1010 06:50:45.650269 4822 scope.go:117] "RemoveContainer" 
containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:50:45 crc kubenswrapper[4822]: E1010 06:50:45.651116 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:51:00 crc kubenswrapper[4822]: I1010 06:51:00.650178 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:51:00 crc kubenswrapper[4822]: E1010 06:51:00.651049 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:51:14 crc kubenswrapper[4822]: I1010 06:51:14.649739 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:51:14 crc kubenswrapper[4822]: E1010 06:51:14.650433 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.571235 4822 scope.go:117] 
"RemoveContainer" containerID="df00ac7cd3316056516aa5090f987440f7eb094e9585dd62542db77e5346e015" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.595171 4822 scope.go:117] "RemoveContainer" containerID="d316ea091d0d3fd74e111a4ac49ba83ff68275a343bdfa35aaa4a262cd3d37ff" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.615153 4822 scope.go:117] "RemoveContainer" containerID="83b64c5e02ef1f3e662badc32279e52bb5de1266e576765efe6ed3ac8baae657" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.653429 4822 scope.go:117] "RemoveContainer" containerID="e4e5f571245440132f9f5f730db406d6b9a307ca592fcc9f54d08211bd30cd68" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.691063 4822 scope.go:117] "RemoveContainer" containerID="164f4d17f67c30c9f98fb7fe3fea3e4c324c7f1a7a425e0992fdc9a6d2c119e1" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.717979 4822 scope.go:117] "RemoveContainer" containerID="2236649086e06115ee6f2c2d259dcd51713877ec4cfa78014abba7e69c828377" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.735093 4822 scope.go:117] "RemoveContainer" containerID="53d9ccc3b552464384024fafe8b9e9476990f0e498f5251b3ebd572bda3dd7c2" Oct 10 06:51:20 crc kubenswrapper[4822]: I1010 06:51:20.759426 4822 scope.go:117] "RemoveContainer" containerID="c6d5c971df658fefb60ce2392b3efcf4416ced783ebf37b679584284d13e2d5f" Oct 10 06:51:27 crc kubenswrapper[4822]: I1010 06:51:27.650477 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:51:27 crc kubenswrapper[4822]: E1010 06:51:27.651050 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:51:41 crc kubenswrapper[4822]: I1010 06:51:41.650903 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:51:41 crc kubenswrapper[4822]: E1010 06:51:41.651581 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:51:54 crc kubenswrapper[4822]: I1010 06:51:54.649969 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:51:54 crc kubenswrapper[4822]: E1010 06:51:54.651679 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:52:09 crc kubenswrapper[4822]: I1010 06:52:09.650862 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:52:09 crc kubenswrapper[4822]: E1010 06:52:09.651593 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:52:20 crc kubenswrapper[4822]: I1010 06:52:20.912548 4822 scope.go:117] "RemoveContainer" containerID="c7baa3dcfc861f3f0c1cd8711d2244d5d331b16f63e69a27aa93f37eaaee5a7d" Oct 10 06:52:20 crc kubenswrapper[4822]: I1010 06:52:20.934255 4822 scope.go:117] "RemoveContainer" containerID="9c50f2c0fe0d406dc24edf889d7edf03b140f900ca3e39af887f67f157e22dca" Oct 10 06:52:20 crc kubenswrapper[4822]: I1010 06:52:20.975535 4822 scope.go:117] "RemoveContainer" containerID="4e20c2d87e457b4faea64c3dea709aa2f899b6ef000fddafdc23dfca5479df95" Oct 10 06:52:21 crc kubenswrapper[4822]: I1010 06:52:21.014427 4822 scope.go:117] "RemoveContainer" containerID="baea442fa07cf4f678380e3207aef11e9c69e1b72f190fc82c22d6fed63d0641" Oct 10 06:52:22 crc kubenswrapper[4822]: I1010 06:52:22.650281 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:52:22 crc kubenswrapper[4822]: E1010 06:52:22.650674 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:52:37 crc kubenswrapper[4822]: I1010 06:52:37.650017 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:52:37 crc kubenswrapper[4822]: E1010 06:52:37.650668 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:52:52 crc kubenswrapper[4822]: I1010 06:52:52.650174 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:52:52 crc kubenswrapper[4822]: E1010 06:52:52.650985 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:53:06 crc kubenswrapper[4822]: I1010 06:53:06.650612 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:53:06 crc kubenswrapper[4822]: E1010 06:53:06.651263 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:53:20 crc kubenswrapper[4822]: I1010 06:53:20.652520 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:53:20 crc kubenswrapper[4822]: E1010 06:53:20.654441 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:53:21 crc kubenswrapper[4822]: I1010 06:53:21.101698 4822 scope.go:117] "RemoveContainer" containerID="2670d1035b9acdfd45f5637b300e37d6a4c39796a1a60e3494ae45d0369ae659" Oct 10 06:53:21 crc kubenswrapper[4822]: I1010 06:53:21.123005 4822 scope.go:117] "RemoveContainer" containerID="f10e534bb95fa2ebc7af37b8031bdbf1a65fb1ad6946670369531ead85fae14a" Oct 10 06:53:21 crc kubenswrapper[4822]: I1010 06:53:21.142638 4822 scope.go:117] "RemoveContainer" containerID="7f58aba361d15a267262f8637de2ab720390632cc57a89026f070c3fadcc16d0" Oct 10 06:53:21 crc kubenswrapper[4822]: I1010 06:53:21.170146 4822 scope.go:117] "RemoveContainer" containerID="81aaba24eab90858435b6b488bfc6f4ab6daf0328d75fc96b1283f66920ce6a0" Oct 10 06:53:31 crc kubenswrapper[4822]: I1010 06:53:31.650231 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:53:31 crc kubenswrapper[4822]: E1010 06:53:31.652257 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:53:45 crc kubenswrapper[4822]: I1010 06:53:45.650788 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:53:45 crc kubenswrapper[4822]: E1010 06:53:45.654915 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:53:59 crc kubenswrapper[4822]: I1010 06:53:59.650142 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:53:59 crc kubenswrapper[4822]: E1010 06:53:59.651251 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:54:11 crc kubenswrapper[4822]: I1010 06:54:11.650941 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:54:11 crc kubenswrapper[4822]: E1010 06:54:11.651752 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:54:23 crc kubenswrapper[4822]: I1010 06:54:23.655862 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:54:23 crc kubenswrapper[4822]: E1010 06:54:23.658966 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:54:38 crc kubenswrapper[4822]: I1010 06:54:38.650388 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:54:38 crc kubenswrapper[4822]: E1010 06:54:38.651153 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:54:50 crc kubenswrapper[4822]: I1010 06:54:50.650043 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:54:50 crc kubenswrapper[4822]: E1010 06:54:50.650845 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 06:55:03 crc kubenswrapper[4822]: I1010 06:55:03.655537 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:55:04 crc kubenswrapper[4822]: I1010 06:55:04.518236 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1"} Oct 10 06:57:31 crc kubenswrapper[4822]: I1010 06:57:31.336566 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:57:31 crc kubenswrapper[4822]: I1010 06:57:31.337299 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:58:01 crc kubenswrapper[4822]: I1010 06:58:01.336630 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:58:01 crc kubenswrapper[4822]: I1010 06:58:01.337261 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:58:31 crc kubenswrapper[4822]: I1010 06:58:31.336506 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:58:31 crc kubenswrapper[4822]: I1010 06:58:31.337100 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:58:31 crc kubenswrapper[4822]: I1010 06:58:31.337139 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 06:58:31 crc kubenswrapper[4822]: I1010 06:58:31.337678 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:58:31 crc kubenswrapper[4822]: I1010 06:58:31.337722 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1" gracePeriod=600 Oct 10 06:58:32 crc kubenswrapper[4822]: I1010 06:58:32.112585 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1" exitCode=0 Oct 10 06:58:32 crc kubenswrapper[4822]: I1010 06:58:32.112652 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1"} Oct 10 06:58:32 crc kubenswrapper[4822]: I1010 06:58:32.113261 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc"} Oct 10 06:58:32 crc kubenswrapper[4822]: I1010 06:58:32.113283 4822 scope.go:117] "RemoveContainer" containerID="d67bdfe030e956ec0dea224c6366aca42d93707621177086dbdff7c417fe239b" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.623535 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:58:48 crc kubenswrapper[4822]: E1010 06:58:48.624379 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="extract-content" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.624393 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="extract-content" Oct 10 06:58:48 crc kubenswrapper[4822]: E1010 06:58:48.624408 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="extract-utilities" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.624417 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="extract-utilities" Oct 10 06:58:48 crc kubenswrapper[4822]: E1010 06:58:48.624438 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="registry-server" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.624447 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" 
containerName="registry-server" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.624611 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="31fa3618-b06d-4258-bf82-82d1871c0a31" containerName="registry-server" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.625768 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.639243 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.708410 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.708458 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshzx\" (UniqueName: \"kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.708488 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.809380 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.809433 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshzx\" (UniqueName: \"kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.809485 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.810366 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.810453 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.832302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshzx\" (UniqueName: 
\"kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx\") pod \"community-operators-cxd6n\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:48 crc kubenswrapper[4822]: I1010 06:58:48.949217 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:49 crc kubenswrapper[4822]: I1010 06:58:49.410633 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:58:50 crc kubenswrapper[4822]: I1010 06:58:50.264214 4822 generic.go:334] "Generic (PLEG): container finished" podID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerID="887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65" exitCode=0 Oct 10 06:58:50 crc kubenswrapper[4822]: I1010 06:58:50.264302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerDied","Data":"887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65"} Oct 10 06:58:50 crc kubenswrapper[4822]: I1010 06:58:50.266737 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 06:58:50 crc kubenswrapper[4822]: I1010 06:58:50.267269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerStarted","Data":"1f0e42798915eb19a589e83304d538884380050f5dc03ef4d9250e69da6070f4"} Oct 10 06:58:51 crc kubenswrapper[4822]: I1010 06:58:51.275912 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerStarted","Data":"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78"} Oct 10 06:58:52 
crc kubenswrapper[4822]: I1010 06:58:52.285494 4822 generic.go:334] "Generic (PLEG): container finished" podID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerID="50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78" exitCode=0 Oct 10 06:58:52 crc kubenswrapper[4822]: I1010 06:58:52.285554 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerDied","Data":"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78"} Oct 10 06:58:53 crc kubenswrapper[4822]: I1010 06:58:53.297440 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerStarted","Data":"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31"} Oct 10 06:58:58 crc kubenswrapper[4822]: I1010 06:58:58.950600 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:58 crc kubenswrapper[4822]: I1010 06:58:58.951192 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:58 crc kubenswrapper[4822]: I1010 06:58:58.999694 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:59 crc kubenswrapper[4822]: I1010 06:58:59.022720 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cxd6n" podStartSLOduration=8.546438703 podStartE2EDuration="11.022573757s" podCreationTimestamp="2025-10-10 06:58:48 +0000 UTC" firstStartedPulling="2025-10-10 06:58:50.266341218 +0000 UTC m=+2077.361499434" lastFinishedPulling="2025-10-10 06:58:52.742476252 +0000 UTC m=+2079.837634488" observedRunningTime="2025-10-10 06:58:53.321199628 +0000 UTC 
m=+2080.416357844" watchObservedRunningTime="2025-10-10 06:58:59.022573757 +0000 UTC m=+2086.117731953" Oct 10 06:58:59 crc kubenswrapper[4822]: I1010 06:58:59.399006 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:58:59 crc kubenswrapper[4822]: I1010 06:58:59.455651 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.357370 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cxd6n" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="registry-server" containerID="cri-o://5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31" gracePeriod=2 Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.761563 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.910920 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content\") pod \"0434bd72-1726-4468-96bb-3d54147d0e0c\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.911020 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mshzx\" (UniqueName: \"kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx\") pod \"0434bd72-1726-4468-96bb-3d54147d0e0c\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.911067 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities\") pod \"0434bd72-1726-4468-96bb-3d54147d0e0c\" (UID: \"0434bd72-1726-4468-96bb-3d54147d0e0c\") " Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.912135 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities" (OuterVolumeSpecName: "utilities") pod "0434bd72-1726-4468-96bb-3d54147d0e0c" (UID: "0434bd72-1726-4468-96bb-3d54147d0e0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:59:01 crc kubenswrapper[4822]: I1010 06:59:01.916677 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx" (OuterVolumeSpecName: "kube-api-access-mshzx") pod "0434bd72-1726-4468-96bb-3d54147d0e0c" (UID: "0434bd72-1726-4468-96bb-3d54147d0e0c"). InnerVolumeSpecName "kube-api-access-mshzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.013360 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mshzx\" (UniqueName: \"kubernetes.io/projected/0434bd72-1726-4468-96bb-3d54147d0e0c-kube-api-access-mshzx\") on node \"crc\" DevicePath \"\"" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.013397 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.348253 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0434bd72-1726-4468-96bb-3d54147d0e0c" (UID: "0434bd72-1726-4468-96bb-3d54147d0e0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.365850 4822 generic.go:334] "Generic (PLEG): container finished" podID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerID="5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31" exitCode=0 Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.365901 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerDied","Data":"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31"} Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.365934 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cxd6n" event={"ID":"0434bd72-1726-4468-96bb-3d54147d0e0c","Type":"ContainerDied","Data":"1f0e42798915eb19a589e83304d538884380050f5dc03ef4d9250e69da6070f4"} Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.365954 4822 scope.go:117] "RemoveContainer" containerID="5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.366091 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cxd6n" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.404910 4822 scope.go:117] "RemoveContainer" containerID="50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.414236 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.419327 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0434bd72-1726-4468-96bb-3d54147d0e0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.420360 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cxd6n"] Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.423458 4822 scope.go:117] "RemoveContainer" containerID="887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.454270 4822 scope.go:117] "RemoveContainer" containerID="5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31" Oct 10 06:59:02 crc kubenswrapper[4822]: E1010 06:59:02.454682 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31\": container with ID starting with 5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31 not found: ID does not exist" containerID="5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.454735 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31"} err="failed to get container status 
\"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31\": rpc error: code = NotFound desc = could not find container \"5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31\": container with ID starting with 5c47d67f695ebdc0ddd4358e276eef68e16f6979aed63c0f6ce5f4dcbb760d31 not found: ID does not exist" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.454768 4822 scope.go:117] "RemoveContainer" containerID="50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78" Oct 10 06:59:02 crc kubenswrapper[4822]: E1010 06:59:02.455127 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78\": container with ID starting with 50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78 not found: ID does not exist" containerID="50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.455172 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78"} err="failed to get container status \"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78\": rpc error: code = NotFound desc = could not find container \"50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78\": container with ID starting with 50b125869c152028542f7404e57c76c801affef713b67356e4de5f703ab81b78 not found: ID does not exist" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.455199 4822 scope.go:117] "RemoveContainer" containerID="887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65" Oct 10 06:59:02 crc kubenswrapper[4822]: E1010 06:59:02.455450 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65\": container with ID starting with 887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65 not found: ID does not exist" containerID="887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65" Oct 10 06:59:02 crc kubenswrapper[4822]: I1010 06:59:02.455488 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65"} err="failed to get container status \"887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65\": rpc error: code = NotFound desc = could not find container \"887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65\": container with ID starting with 887ad7b9ab2765ecd4a96b4bbdfb84f5912b7425c221260b8b6ab42b7e1d3c65 not found: ID does not exist" Oct 10 06:59:03 crc kubenswrapper[4822]: I1010 06:59:03.659592 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" path="/var/lib/kubelet/pods/0434bd72-1726-4468-96bb-3d54147d0e0c/volumes" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.152837 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp"] Oct 10 07:00:00 crc kubenswrapper[4822]: E1010 07:00:00.153722 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="extract-content" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.153743 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="extract-content" Oct 10 07:00:00 crc kubenswrapper[4822]: E1010 07:00:00.153770 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="extract-utilities" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.153781 4822 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="extract-utilities" Oct 10 07:00:00 crc kubenswrapper[4822]: E1010 07:00:00.153820 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="registry-server" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.153832 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="registry-server" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.154471 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0434bd72-1726-4468-96bb-3d54147d0e0c" containerName="registry-server" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.155246 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.159131 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.159569 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.164756 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp"] Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.241632 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: 
I1010 07:00:00.242299 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6bwj\" (UniqueName: \"kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.242482 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.344058 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.344120 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.344174 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6bwj\" (UniqueName: \"kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj\") pod \"collect-profiles-29334660-77gvp\" 
(UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.345286 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.350932 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.361196 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6bwj\" (UniqueName: \"kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj\") pod \"collect-profiles-29334660-77gvp\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.486283 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:00 crc kubenswrapper[4822]: I1010 07:00:00.960934 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp"] Oct 10 07:00:01 crc kubenswrapper[4822]: I1010 07:00:01.809999 4822 generic.go:334] "Generic (PLEG): container finished" podID="0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" containerID="bcb25ed755addb1e134da06154c34459f4a5c0bd8dae051dce5d4c69d16ff75d" exitCode=0 Oct 10 07:00:01 crc kubenswrapper[4822]: I1010 07:00:01.810083 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" event={"ID":"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1","Type":"ContainerDied","Data":"bcb25ed755addb1e134da06154c34459f4a5c0bd8dae051dce5d4c69d16ff75d"} Oct 10 07:00:01 crc kubenswrapper[4822]: I1010 07:00:01.810128 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" event={"ID":"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1","Type":"ContainerStarted","Data":"988271c92e099d72f20520642a7a7bfda54d5b0cd4e799a77ffb83c74dae903b"} Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.098575 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.186518 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume\") pod \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.186598 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume\") pod \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.186737 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6bwj\" (UniqueName: \"kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj\") pod \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\" (UID: \"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1\") " Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.187584 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" (UID: "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.199988 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" (UID: "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.199992 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj" (OuterVolumeSpecName: "kube-api-access-c6bwj") pod "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" (UID: "0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1"). InnerVolumeSpecName "kube-api-access-c6bwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.289873 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6bwj\" (UniqueName: \"kubernetes.io/projected/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-kube-api-access-c6bwj\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.289919 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.289931 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.830348 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" event={"ID":"0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1","Type":"ContainerDied","Data":"988271c92e099d72f20520642a7a7bfda54d5b0cd4e799a77ffb83c74dae903b"} Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.830767 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="988271c92e099d72f20520642a7a7bfda54d5b0cd4e799a77ffb83c74dae903b" Oct 10 07:00:03 crc kubenswrapper[4822]: I1010 07:00:03.830465 4822 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp" Oct 10 07:00:04 crc kubenswrapper[4822]: I1010 07:00:04.167058 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx"] Oct 10 07:00:04 crc kubenswrapper[4822]: I1010 07:00:04.172126 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334615-ssmbx"] Oct 10 07:00:05 crc kubenswrapper[4822]: I1010 07:00:05.658777 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df5f09d-0d1c-40cf-9041-695d831d552d" path="/var/lib/kubelet/pods/2df5f09d-0d1c-40cf-9041-695d831d552d/volumes" Oct 10 07:00:21 crc kubenswrapper[4822]: I1010 07:00:21.353570 4822 scope.go:117] "RemoveContainer" containerID="4028de656a268672b0a3b3809074bd83441264c07fc5342630745c98b932cd3b" Oct 10 07:00:31 crc kubenswrapper[4822]: I1010 07:00:31.336863 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:00:31 crc kubenswrapper[4822]: I1010 07:00:31.337380 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:01:01 crc kubenswrapper[4822]: I1010 07:01:01.336468 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 10 07:01:01 crc kubenswrapper[4822]: I1010 07:01:01.337126 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.610327 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:02 crc kubenswrapper[4822]: E1010 07:01:02.610621 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" containerName="collect-profiles" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.610637 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" containerName="collect-profiles" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.610788 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" containerName="collect-profiles" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.611866 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.624932 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.749979 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.750397 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zfw\" (UniqueName: \"kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.750552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.851708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zfw\" (UniqueName: \"kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.851790 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.851841 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.852342 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.852898 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.887712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zfw\" (UniqueName: \"kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw\") pod \"certified-operators-n2z8s\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:02 crc kubenswrapper[4822]: I1010 07:01:02.938743 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.230513 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.237079 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.251237 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.262881 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.267286 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56xww\" (UniqueName: \"kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.268180 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.268339 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " 
pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.320654 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerStarted","Data":"79d44a958d24a54057d6bcb9a1bface5e2df9804240b3ad14007a4b46030097f"} Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.370657 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56xww\" (UniqueName: \"kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.370764 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.370871 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.371434 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.371653 
4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.393546 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56xww\" (UniqueName: \"kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww\") pod \"redhat-marketplace-bstbp\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.586457 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:03 crc kubenswrapper[4822]: I1010 07:01:03.810301 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:04 crc kubenswrapper[4822]: I1010 07:01:04.332244 4822 generic.go:334] "Generic (PLEG): container finished" podID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerID="22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357" exitCode=0 Oct 10 07:01:04 crc kubenswrapper[4822]: I1010 07:01:04.332353 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerDied","Data":"22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357"} Oct 10 07:01:04 crc kubenswrapper[4822]: I1010 07:01:04.333601 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerStarted","Data":"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e"} Oct 10 
07:01:04 crc kubenswrapper[4822]: I1010 07:01:04.333650 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerStarted","Data":"4e28ff8fd1ba0d352ec326bcfdd8b10e305aeafdc0080c0ee35e0f8d605f804e"} Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.014560 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.017771 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.028350 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.197223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.197280 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.197430 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f269\" (UniqueName: \"kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269\") pod \"redhat-operators-d695n\" (UID: 
\"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.298469 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f269\" (UniqueName: \"kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.298901 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.298932 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.299347 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.299705 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " 
pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.320044 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f269\" (UniqueName: \"kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269\") pod \"redhat-operators-d695n\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.342455 4822 generic.go:334] "Generic (PLEG): container finished" podID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerID="d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e" exitCode=0 Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.342518 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerDied","Data":"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e"} Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.351656 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:05 crc kubenswrapper[4822]: I1010 07:01:05.836429 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:06 crc kubenswrapper[4822]: E1010 07:01:06.116875 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4083ee03_12fe_4b78_86e2_9568eba873d4.slice/crio-conmon-05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4083ee03_12fe_4b78_86e2_9568eba873d4.slice/crio-05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.351208 4822 generic.go:334] "Generic (PLEG): container finished" podID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerID="93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991" exitCode=0 Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.351310 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerDied","Data":"93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991"} Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.353823 4822 generic.go:334] "Generic (PLEG): container finished" podID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerID="05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1" exitCode=0 Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.353930 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" 
event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerDied","Data":"05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1"} Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.354046 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerStarted","Data":"05e7392c4c0163112db746743af633dc8b1576e045686e0ec7d9fb69bf4cb81a"} Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.358207 4822 generic.go:334] "Generic (PLEG): container finished" podID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerID="7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5" exitCode=0 Oct 10 07:01:06 crc kubenswrapper[4822]: I1010 07:01:06.358262 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerDied","Data":"7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5"} Oct 10 07:01:07 crc kubenswrapper[4822]: I1010 07:01:07.372957 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerStarted","Data":"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae"} Oct 10 07:01:07 crc kubenswrapper[4822]: I1010 07:01:07.379074 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerStarted","Data":"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84"} Oct 10 07:01:07 crc kubenswrapper[4822]: I1010 07:01:07.382365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" 
event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerStarted","Data":"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84"} Oct 10 07:01:07 crc kubenswrapper[4822]: I1010 07:01:07.412261 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bstbp" podStartSLOduration=1.61262097 podStartE2EDuration="4.412242109s" podCreationTimestamp="2025-10-10 07:01:03 +0000 UTC" firstStartedPulling="2025-10-10 07:01:04.334924241 +0000 UTC m=+2211.430082447" lastFinishedPulling="2025-10-10 07:01:07.13454539 +0000 UTC m=+2214.229703586" observedRunningTime="2025-10-10 07:01:07.410130158 +0000 UTC m=+2214.505288344" watchObservedRunningTime="2025-10-10 07:01:07.412242109 +0000 UTC m=+2214.507400305" Oct 10 07:01:07 crc kubenswrapper[4822]: I1010 07:01:07.431332 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n2z8s" podStartSLOduration=2.876409338 podStartE2EDuration="5.431301259s" podCreationTimestamp="2025-10-10 07:01:02 +0000 UTC" firstStartedPulling="2025-10-10 07:01:04.334251362 +0000 UTC m=+2211.429409548" lastFinishedPulling="2025-10-10 07:01:06.889143273 +0000 UTC m=+2213.984301469" observedRunningTime="2025-10-10 07:01:07.429715744 +0000 UTC m=+2214.524873940" watchObservedRunningTime="2025-10-10 07:01:07.431301259 +0000 UTC m=+2214.526459455" Oct 10 07:01:08 crc kubenswrapper[4822]: I1010 07:01:08.392565 4822 generic.go:334] "Generic (PLEG): container finished" podID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerID="6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae" exitCode=0 Oct 10 07:01:08 crc kubenswrapper[4822]: I1010 07:01:08.392644 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerDied","Data":"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae"} Oct 10 
07:01:09 crc kubenswrapper[4822]: I1010 07:01:09.401748 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerStarted","Data":"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06"} Oct 10 07:01:09 crc kubenswrapper[4822]: I1010 07:01:09.425277 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d695n" podStartSLOduration=2.820995506 podStartE2EDuration="5.425251322s" podCreationTimestamp="2025-10-10 07:01:04 +0000 UTC" firstStartedPulling="2025-10-10 07:01:06.355715209 +0000 UTC m=+2213.450873405" lastFinishedPulling="2025-10-10 07:01:08.959971025 +0000 UTC m=+2216.055129221" observedRunningTime="2025-10-10 07:01:09.42102198 +0000 UTC m=+2216.516180206" watchObservedRunningTime="2025-10-10 07:01:09.425251322 +0000 UTC m=+2216.520409518" Oct 10 07:01:12 crc kubenswrapper[4822]: I1010 07:01:12.939049 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:12 crc kubenswrapper[4822]: I1010 07:01:12.940550 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:12 crc kubenswrapper[4822]: I1010 07:01:12.987437 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:13 crc kubenswrapper[4822]: I1010 07:01:13.468441 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:13 crc kubenswrapper[4822]: I1010 07:01:13.587227 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:13 crc kubenswrapper[4822]: I1010 07:01:13.587300 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:13 crc kubenswrapper[4822]: I1010 07:01:13.628757 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:13 crc kubenswrapper[4822]: I1010 07:01:13.798773 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:14 crc kubenswrapper[4822]: I1010 07:01:14.510754 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.352080 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.352175 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.425306 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.450737 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n2z8s" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="registry-server" containerID="cri-o://78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84" gracePeriod=2 Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.498082 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.955545 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:15 crc kubenswrapper[4822]: I1010 07:01:15.999453 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.075683 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities\") pod \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.075756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content\") pod \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.075890 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zfw\" (UniqueName: \"kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw\") pod \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\" (UID: \"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3\") " Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.076884 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities" (OuterVolumeSpecName: "utilities") pod "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" (UID: "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.088086 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw" (OuterVolumeSpecName: "kube-api-access-q7zfw") pod "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" (UID: "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3"). InnerVolumeSpecName "kube-api-access-q7zfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.177704 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.177753 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zfw\" (UniqueName: \"kubernetes.io/projected/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-kube-api-access-q7zfw\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.463789 4822 generic.go:334] "Generic (PLEG): container finished" podID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerID="78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84" exitCode=0 Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.464078 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bstbp" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="registry-server" containerID="cri-o://afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84" gracePeriod=2 Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.464429 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2z8s" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.464846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerDied","Data":"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84"} Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.464878 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2z8s" event={"ID":"03ad4e3e-3cd8-499f-96eb-bc121a1df1a3","Type":"ContainerDied","Data":"79d44a958d24a54057d6bcb9a1bface5e2df9804240b3ad14007a4b46030097f"} Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.464894 4822 scope.go:117] "RemoveContainer" containerID="78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.485921 4822 scope.go:117] "RemoveContainer" containerID="7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.508558 4822 scope.go:117] "RemoveContainer" containerID="22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.532190 4822 scope.go:117] "RemoveContainer" containerID="78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84" Oct 10 07:01:16 crc kubenswrapper[4822]: E1010 07:01:16.532635 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84\": container with ID starting with 78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84 not found: ID does not exist" containerID="78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.532666 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84"} err="failed to get container status \"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84\": rpc error: code = NotFound desc = could not find container \"78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84\": container with ID starting with 78935e54a316e45da2cc7ebbce8437ef9dc90ab6b3de06e92f745d10e1f05d84 not found: ID does not exist" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.532688 4822 scope.go:117] "RemoveContainer" containerID="7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5" Oct 10 07:01:16 crc kubenswrapper[4822]: E1010 07:01:16.533048 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5\": container with ID starting with 7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5 not found: ID does not exist" containerID="7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.533105 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5"} err="failed to get container status \"7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5\": rpc error: code = NotFound desc = could not find container \"7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5\": container with ID starting with 7af279518e001adcecbb3c310711470a7ed32425fb1879dd3337b1b2d7b11fc5 not found: ID does not exist" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.533144 4822 scope.go:117] "RemoveContainer" containerID="22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357" Oct 10 07:01:16 crc kubenswrapper[4822]: E1010 07:01:16.533506 4822 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357\": container with ID starting with 22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357 not found: ID does not exist" containerID="22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.533538 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357"} err="failed to get container status \"22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357\": rpc error: code = NotFound desc = could not find container \"22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357\": container with ID starting with 22097efc9f47ecb797baee676d74d8dfc10fbdcb08241c891e5e014c9933b357 not found: ID does not exist" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.821757 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" (UID: "03ad4e3e-3cd8-499f-96eb-bc121a1df1a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:16 crc kubenswrapper[4822]: I1010 07:01:16.887619 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.097236 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.105246 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n2z8s"] Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.332663 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.393310 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content\") pod \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.393445 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities\") pod \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.393529 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56xww\" (UniqueName: \"kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww\") pod \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\" (UID: \"abdeb1d2-7ca0-455f-b069-6dbd974e487a\") " Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 
07:01:17.394375 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities" (OuterVolumeSpecName: "utilities") pod "abdeb1d2-7ca0-455f-b069-6dbd974e487a" (UID: "abdeb1d2-7ca0-455f-b069-6dbd974e487a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.398392 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww" (OuterVolumeSpecName: "kube-api-access-56xww") pod "abdeb1d2-7ca0-455f-b069-6dbd974e487a" (UID: "abdeb1d2-7ca0-455f-b069-6dbd974e487a"). InnerVolumeSpecName "kube-api-access-56xww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.409633 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abdeb1d2-7ca0-455f-b069-6dbd974e487a" (UID: "abdeb1d2-7ca0-455f-b069-6dbd974e487a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.477028 4822 generic.go:334] "Generic (PLEG): container finished" podID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerID="afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84" exitCode=0 Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.477128 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bstbp" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.477381 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerDied","Data":"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84"} Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.477450 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bstbp" event={"ID":"abdeb1d2-7ca0-455f-b069-6dbd974e487a","Type":"ContainerDied","Data":"4e28ff8fd1ba0d352ec326bcfdd8b10e305aeafdc0080c0ee35e0f8d605f804e"} Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.477473 4822 scope.go:117] "RemoveContainer" containerID="afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.495430 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.495488 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56xww\" (UniqueName: \"kubernetes.io/projected/abdeb1d2-7ca0-455f-b069-6dbd974e487a-kube-api-access-56xww\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.495513 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abdeb1d2-7ca0-455f-b069-6dbd974e487a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.495851 4822 scope.go:117] "RemoveContainer" containerID="93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.511108 4822 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.515226 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bstbp"] Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.545971 4822 scope.go:117] "RemoveContainer" containerID="d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.560044 4822 scope.go:117] "RemoveContainer" containerID="afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84" Oct 10 07:01:17 crc kubenswrapper[4822]: E1010 07:01:17.560549 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84\": container with ID starting with afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84 not found: ID does not exist" containerID="afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.560616 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84"} err="failed to get container status \"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84\": rpc error: code = NotFound desc = could not find container \"afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84\": container with ID starting with afba2ad7afd016644384abb67646949018947cfe4bc5051e1583631da145cb84 not found: ID does not exist" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.560647 4822 scope.go:117] "RemoveContainer" containerID="93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991" Oct 10 07:01:17 crc kubenswrapper[4822]: E1010 07:01:17.561007 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991\": container with ID starting with 93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991 not found: ID does not exist" containerID="93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.561047 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991"} err="failed to get container status \"93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991\": rpc error: code = NotFound desc = could not find container \"93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991\": container with ID starting with 93a6637723c9d2450dced6faa5f6364f56bf728872deb43880d5c8386cf78991 not found: ID does not exist" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.561071 4822 scope.go:117] "RemoveContainer" containerID="d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e" Oct 10 07:01:17 crc kubenswrapper[4822]: E1010 07:01:17.561316 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e\": container with ID starting with d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e not found: ID does not exist" containerID="d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.561362 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e"} err="failed to get container status \"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e\": rpc error: code = NotFound desc = could not find container 
\"d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e\": container with ID starting with d0bf73befe3fd6b75c5f5ca040a64683987dfbb6c59bc5c496637a292256834e not found: ID does not exist" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.658345 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" path="/var/lib/kubelet/pods/03ad4e3e-3cd8-499f-96eb-bc121a1df1a3/volumes" Oct 10 07:01:17 crc kubenswrapper[4822]: I1010 07:01:17.659029 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" path="/var/lib/kubelet/pods/abdeb1d2-7ca0-455f-b069-6dbd974e487a/volumes" Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.393718 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.393960 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d695n" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="registry-server" containerID="cri-o://0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06" gracePeriod=2 Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.800779 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.927396 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content\") pod \"4083ee03-12fe-4b78-86e2-9568eba873d4\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.927458 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities\") pod \"4083ee03-12fe-4b78-86e2-9568eba873d4\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.927485 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f269\" (UniqueName: \"kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269\") pod \"4083ee03-12fe-4b78-86e2-9568eba873d4\" (UID: \"4083ee03-12fe-4b78-86e2-9568eba873d4\") " Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.929522 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities" (OuterVolumeSpecName: "utilities") pod "4083ee03-12fe-4b78-86e2-9568eba873d4" (UID: "4083ee03-12fe-4b78-86e2-9568eba873d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:18 crc kubenswrapper[4822]: I1010 07:01:18.931118 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269" (OuterVolumeSpecName: "kube-api-access-5f269") pod "4083ee03-12fe-4b78-86e2-9568eba873d4" (UID: "4083ee03-12fe-4b78-86e2-9568eba873d4"). InnerVolumeSpecName "kube-api-access-5f269". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.029076 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.029114 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f269\" (UniqueName: \"kubernetes.io/projected/4083ee03-12fe-4b78-86e2-9568eba873d4-kube-api-access-5f269\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.029251 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4083ee03-12fe-4b78-86e2-9568eba873d4" (UID: "4083ee03-12fe-4b78-86e2-9568eba873d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.130406 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4083ee03-12fe-4b78-86e2-9568eba873d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.497885 4822 generic.go:334] "Generic (PLEG): container finished" podID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerID="0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06" exitCode=0 Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.497936 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerDied","Data":"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06"} Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.497956 4822 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d695n" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.498753 4822 scope.go:117] "RemoveContainer" containerID="0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.498692 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d695n" event={"ID":"4083ee03-12fe-4b78-86e2-9568eba873d4","Type":"ContainerDied","Data":"05e7392c4c0163112db746743af633dc8b1576e045686e0ec7d9fb69bf4cb81a"} Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.514941 4822 scope.go:117] "RemoveContainer" containerID="6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.535149 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.545303 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d695n"] Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.556864 4822 scope.go:117] "RemoveContainer" containerID="05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.572970 4822 scope.go:117] "RemoveContainer" containerID="0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06" Oct 10 07:01:19 crc kubenswrapper[4822]: E1010 07:01:19.573479 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06\": container with ID starting with 0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06 not found: ID does not exist" containerID="0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.573551 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06"} err="failed to get container status \"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06\": rpc error: code = NotFound desc = could not find container \"0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06\": container with ID starting with 0c0efb9a4aa691d95200d3340a37fdaac26e11eeb3cde7c79c14950c52c76a06 not found: ID does not exist" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.573573 4822 scope.go:117] "RemoveContainer" containerID="6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae" Oct 10 07:01:19 crc kubenswrapper[4822]: E1010 07:01:19.573791 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae\": container with ID starting with 6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae not found: ID does not exist" containerID="6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.573873 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae"} err="failed to get container status \"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae\": rpc error: code = NotFound desc = could not find container \"6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae\": container with ID starting with 6bc567cd71f957051ab778c410910872ed58ca5fa5639c62549aa9a7bd1504ae not found: ID does not exist" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.573942 4822 scope.go:117] "RemoveContainer" containerID="05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1" Oct 10 07:01:19 crc kubenswrapper[4822]: E1010 
07:01:19.574314 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1\": container with ID starting with 05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1 not found: ID does not exist" containerID="05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.574382 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1"} err="failed to get container status \"05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1\": rpc error: code = NotFound desc = could not find container \"05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1\": container with ID starting with 05246ef0bbf764cd9f58ecd087811976f2265562124c6adde5d46005d47a79c1 not found: ID does not exist" Oct 10 07:01:19 crc kubenswrapper[4822]: I1010 07:01:19.659032 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" path="/var/lib/kubelet/pods/4083ee03-12fe-4b78-86e2-9568eba873d4/volumes" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.336863 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.337449 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.337498 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.338244 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.338313 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" gracePeriod=600 Oct 10 07:01:31 crc kubenswrapper[4822]: E1010 07:01:31.464227 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.618976 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" exitCode=0 Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.619038 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc"} Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.619078 4822 scope.go:117] "RemoveContainer" containerID="2bc198fe1ff3f2eede967da6159370cf9340396862a2d02c50b9cbd44ba5bdc1" Oct 10 07:01:31 crc kubenswrapper[4822]: I1010 07:01:31.619685 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:01:31 crc kubenswrapper[4822]: E1010 07:01:31.620356 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:01:44 crc kubenswrapper[4822]: I1010 07:01:44.651044 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:01:44 crc kubenswrapper[4822]: E1010 07:01:44.652181 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:01:56 crc kubenswrapper[4822]: I1010 07:01:56.651301 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:01:56 crc kubenswrapper[4822]: E1010 07:01:56.652399 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:02:09 crc kubenswrapper[4822]: I1010 07:02:09.650115 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:02:09 crc kubenswrapper[4822]: E1010 07:02:09.650789 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:02:24 crc kubenswrapper[4822]: I1010 07:02:24.650418 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:02:24 crc kubenswrapper[4822]: E1010 07:02:24.651234 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:02:36 crc kubenswrapper[4822]: I1010 07:02:36.650157 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:02:36 crc kubenswrapper[4822]: E1010 07:02:36.650849 4822 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:02:50 crc kubenswrapper[4822]: I1010 07:02:50.650876 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:02:50 crc kubenswrapper[4822]: E1010 07:02:50.651799 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:03:03 crc kubenswrapper[4822]: I1010 07:03:03.656342 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:03:03 crc kubenswrapper[4822]: E1010 07:03:03.657154 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:03:14 crc kubenswrapper[4822]: I1010 07:03:14.650744 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:03:14 crc kubenswrapper[4822]: E1010 07:03:14.651662 4822 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:03:28 crc kubenswrapper[4822]: I1010 07:03:28.650738 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:03:28 crc kubenswrapper[4822]: E1010 07:03:28.651459 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:03:39 crc kubenswrapper[4822]: I1010 07:03:39.650696 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:03:39 crc kubenswrapper[4822]: E1010 07:03:39.651725 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:03:53 crc kubenswrapper[4822]: I1010 07:03:53.659218 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:03:53 crc kubenswrapper[4822]: E1010 07:03:53.660295 4822 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:04:06 crc kubenswrapper[4822]: I1010 07:04:06.650994 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:04:06 crc kubenswrapper[4822]: E1010 07:04:06.651711 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:04:17 crc kubenswrapper[4822]: I1010 07:04:17.650083 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:04:17 crc kubenswrapper[4822]: E1010 07:04:17.652397 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:04:30 crc kubenswrapper[4822]: I1010 07:04:30.650883 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:04:30 crc kubenswrapper[4822]: E1010 
07:04:30.651917 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:04:41 crc kubenswrapper[4822]: I1010 07:04:41.650329 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:04:41 crc kubenswrapper[4822]: E1010 07:04:41.651070 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:04:56 crc kubenswrapper[4822]: I1010 07:04:56.649962 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:04:56 crc kubenswrapper[4822]: E1010 07:04:56.650777 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:05:10 crc kubenswrapper[4822]: I1010 07:05:10.651929 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:05:10 crc 
kubenswrapper[4822]: E1010 07:05:10.652696 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:05:23 crc kubenswrapper[4822]: I1010 07:05:23.653492 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:05:23 crc kubenswrapper[4822]: E1010 07:05:23.654164 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:05:36 crc kubenswrapper[4822]: I1010 07:05:36.650661 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:05:36 crc kubenswrapper[4822]: E1010 07:05:36.652431 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:05:51 crc kubenswrapper[4822]: I1010 07:05:51.650140 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 
10 07:05:51 crc kubenswrapper[4822]: E1010 07:05:51.651137 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:06:05 crc kubenswrapper[4822]: I1010 07:06:05.650468 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:06:05 crc kubenswrapper[4822]: E1010 07:06:05.651502 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:06:19 crc kubenswrapper[4822]: I1010 07:06:19.650766 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:06:19 crc kubenswrapper[4822]: E1010 07:06:19.651555 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:06:32 crc kubenswrapper[4822]: I1010 07:06:32.649931 4822 scope.go:117] "RemoveContainer" 
containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:06:32 crc kubenswrapper[4822]: I1010 07:06:32.961310 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad"} Oct 10 07:09:01 crc kubenswrapper[4822]: I1010 07:09:01.336602 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:09:01 crc kubenswrapper[4822]: I1010 07:09:01.337165 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.425071 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426033 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426045 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426059 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426065 
4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426076 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426083 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426095 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426101 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426128 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426138 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426146 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426153 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426164 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426171 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426183 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426189 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="extract-content" Oct 10 07:09:13 crc kubenswrapper[4822]: E1010 07:09:13.426204 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426210 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="extract-utilities" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426338 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4083ee03-12fe-4b78-86e2-9568eba873d4" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426354 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdeb1d2-7ca0-455f-b069-6dbd974e487a" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.426362 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ad4e3e-3cd8-499f-96eb-bc121a1df1a3" containerName="registry-server" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.428244 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.446031 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.500840 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.500887 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.500966 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tzqh\" (UniqueName: \"kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.601958 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tzqh\" (UniqueName: \"kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.602045 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.602084 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.602673 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.602748 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.631022 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tzqh\" (UniqueName: \"kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh\") pod \"community-operators-8lzx9\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:13 crc kubenswrapper[4822]: I1010 07:09:13.755075 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:14 crc kubenswrapper[4822]: I1010 07:09:14.239990 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:14 crc kubenswrapper[4822]: I1010 07:09:14.316293 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerStarted","Data":"57269a462e229dc8dcd10a8a2d7644a316b2ffc492b0a6d5249e506c6b5da951"} Oct 10 07:09:15 crc kubenswrapper[4822]: I1010 07:09:15.328160 4822 generic.go:334] "Generic (PLEG): container finished" podID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerID="66d10060aa4293df405246739342f5a67bfa4802cbd69b3c7e9499ab5e384312" exitCode=0 Oct 10 07:09:15 crc kubenswrapper[4822]: I1010 07:09:15.328256 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerDied","Data":"66d10060aa4293df405246739342f5a67bfa4802cbd69b3c7e9499ab5e384312"} Oct 10 07:09:15 crc kubenswrapper[4822]: I1010 07:09:15.332150 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:09:17 crc kubenswrapper[4822]: I1010 07:09:17.348321 4822 generic.go:334] "Generic (PLEG): container finished" podID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerID="b772c5bebe9ce94af9bfaab23d7028def2ebcabe1e3bdd67c4ba2ede86afcef3" exitCode=0 Oct 10 07:09:17 crc kubenswrapper[4822]: I1010 07:09:17.348425 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerDied","Data":"b772c5bebe9ce94af9bfaab23d7028def2ebcabe1e3bdd67c4ba2ede86afcef3"} Oct 10 07:09:18 crc kubenswrapper[4822]: I1010 07:09:18.361533 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerStarted","Data":"089ac21e93ed1284c1a55dae190a778ab0881e75e7025999a2fa6aee7e65b516"} Oct 10 07:09:18 crc kubenswrapper[4822]: I1010 07:09:18.381372 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8lzx9" podStartSLOduration=2.565169933 podStartE2EDuration="5.381356366s" podCreationTimestamp="2025-10-10 07:09:13 +0000 UTC" firstStartedPulling="2025-10-10 07:09:15.330440778 +0000 UTC m=+2702.425598974" lastFinishedPulling="2025-10-10 07:09:18.146627191 +0000 UTC m=+2705.241785407" observedRunningTime="2025-10-10 07:09:18.379466581 +0000 UTC m=+2705.474624797" watchObservedRunningTime="2025-10-10 07:09:18.381356366 +0000 UTC m=+2705.476514562" Oct 10 07:09:23 crc kubenswrapper[4822]: I1010 07:09:23.755760 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:23 crc kubenswrapper[4822]: I1010 07:09:23.756333 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:23 crc kubenswrapper[4822]: I1010 07:09:23.809591 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:24 crc kubenswrapper[4822]: I1010 07:09:24.447145 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:24 crc kubenswrapper[4822]: I1010 07:09:24.495796 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:26 crc kubenswrapper[4822]: I1010 07:09:26.418358 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8lzx9" 
podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="registry-server" containerID="cri-o://089ac21e93ed1284c1a55dae190a778ab0881e75e7025999a2fa6aee7e65b516" gracePeriod=2 Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.427364 4822 generic.go:334] "Generic (PLEG): container finished" podID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerID="089ac21e93ed1284c1a55dae190a778ab0881e75e7025999a2fa6aee7e65b516" exitCode=0 Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.427431 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerDied","Data":"089ac21e93ed1284c1a55dae190a778ab0881e75e7025999a2fa6aee7e65b516"} Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.484187 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.515727 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities\") pod \"adbad0fa-f35b-4bad-a3af-ec80955f8639\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.518148 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities" (OuterVolumeSpecName: "utilities") pod "adbad0fa-f35b-4bad-a3af-ec80955f8639" (UID: "adbad0fa-f35b-4bad-a3af-ec80955f8639"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.617221 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tzqh\" (UniqueName: \"kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh\") pod \"adbad0fa-f35b-4bad-a3af-ec80955f8639\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.617349 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content\") pod \"adbad0fa-f35b-4bad-a3af-ec80955f8639\" (UID: \"adbad0fa-f35b-4bad-a3af-ec80955f8639\") " Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.617611 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.625460 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh" (OuterVolumeSpecName: "kube-api-access-5tzqh") pod "adbad0fa-f35b-4bad-a3af-ec80955f8639" (UID: "adbad0fa-f35b-4bad-a3af-ec80955f8639"). InnerVolumeSpecName "kube-api-access-5tzqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:27 crc kubenswrapper[4822]: I1010 07:09:27.720191 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tzqh\" (UniqueName: \"kubernetes.io/projected/adbad0fa-f35b-4bad-a3af-ec80955f8639-kube-api-access-5tzqh\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:28 crc kubenswrapper[4822]: I1010 07:09:28.438720 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lzx9" event={"ID":"adbad0fa-f35b-4bad-a3af-ec80955f8639","Type":"ContainerDied","Data":"57269a462e229dc8dcd10a8a2d7644a316b2ffc492b0a6d5249e506c6b5da951"} Oct 10 07:09:28 crc kubenswrapper[4822]: I1010 07:09:28.439079 4822 scope.go:117] "RemoveContainer" containerID="089ac21e93ed1284c1a55dae190a778ab0881e75e7025999a2fa6aee7e65b516" Oct 10 07:09:28 crc kubenswrapper[4822]: I1010 07:09:28.438873 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lzx9" Oct 10 07:09:28 crc kubenswrapper[4822]: I1010 07:09:28.462389 4822 scope.go:117] "RemoveContainer" containerID="b772c5bebe9ce94af9bfaab23d7028def2ebcabe1e3bdd67c4ba2ede86afcef3" Oct 10 07:09:28 crc kubenswrapper[4822]: I1010 07:09:28.487180 4822 scope.go:117] "RemoveContainer" containerID="66d10060aa4293df405246739342f5a67bfa4802cbd69b3c7e9499ab5e384312" Oct 10 07:09:29 crc kubenswrapper[4822]: I1010 07:09:29.831683 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adbad0fa-f35b-4bad-a3af-ec80955f8639" (UID: "adbad0fa-f35b-4bad-a3af-ec80955f8639"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:29 crc kubenswrapper[4822]: I1010 07:09:29.853702 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbad0fa-f35b-4bad-a3af-ec80955f8639-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:29 crc kubenswrapper[4822]: I1010 07:09:29.981086 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:29 crc kubenswrapper[4822]: I1010 07:09:29.985877 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8lzx9"] Oct 10 07:09:31 crc kubenswrapper[4822]: I1010 07:09:31.337177 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:09:31 crc kubenswrapper[4822]: I1010 07:09:31.337990 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:09:31 crc kubenswrapper[4822]: I1010 07:09:31.674633 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" path="/var/lib/kubelet/pods/adbad0fa-f35b-4bad-a3af-ec80955f8639/volumes" Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.336930 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.337552 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.337614 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.338410 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.338468 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad" gracePeriod=600 Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.727226 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad" exitCode=0 Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.727283 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad"} Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.727612 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c"} Oct 10 07:10:01 crc kubenswrapper[4822]: I1010 07:10:01.727638 4822 scope.go:117] "RemoveContainer" containerID="24b21b704c7b94d3ccf56ba7f1cb9de85010d99ba6bde33b2bdda4c0894703bc" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.864787 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:06 crc kubenswrapper[4822]: E1010 07:11:06.865628 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="registry-server" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.865643 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="registry-server" Oct 10 07:11:06 crc kubenswrapper[4822]: E1010 07:11:06.865659 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="extract-utilities" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.865665 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="extract-utilities" Oct 10 07:11:06 crc kubenswrapper[4822]: E1010 07:11:06.865675 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="extract-content" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.865682 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" 
containerName="extract-content" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.865876 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbad0fa-f35b-4bad-a3af-ec80955f8639" containerName="registry-server" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.867042 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.878759 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.947919 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.948008 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2bn\" (UniqueName: \"kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:06 crc kubenswrapper[4822]: I1010 07:11:06.948072 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.049579 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.049658 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.049701 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2bn\" (UniqueName: \"kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.050205 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.050249 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.077309 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2bn\" (UniqueName: 
\"kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn\") pod \"redhat-operators-bp469\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.185519 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:07 crc kubenswrapper[4822]: I1010 07:11:07.631585 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.281322 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerID="0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87" exitCode=0 Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.281405 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerDied","Data":"0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87"} Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.281483 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerStarted","Data":"72fde1b41cdb3f9894096ed9920538ff0b5c4bdc1a2495cecd7f3681551bce09"} Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.480435 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.482693 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.491402 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.577102 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.577185 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.577237 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79kp\" (UniqueName: \"kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.679036 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.679126 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.679152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79kp\" (UniqueName: \"kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.680064 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.680424 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.702764 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79kp\" (UniqueName: \"kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp\") pod \"certified-operators-8tk96\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:08 crc kubenswrapper[4822]: I1010 07:11:08.814771 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:09 crc kubenswrapper[4822]: I1010 07:11:09.265439 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:09 crc kubenswrapper[4822]: W1010 07:11:09.280490 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50e2f47_65d7_4a2e_a94c_245993f5d66e.slice/crio-bd8bf613c3882029037ef19bcef567e108fed1e455b04aec2eba34a5d42b2093 WatchSource:0}: Error finding container bd8bf613c3882029037ef19bcef567e108fed1e455b04aec2eba34a5d42b2093: Status 404 returned error can't find the container with id bd8bf613c3882029037ef19bcef567e108fed1e455b04aec2eba34a5d42b2093 Oct 10 07:11:09 crc kubenswrapper[4822]: I1010 07:11:09.290064 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerStarted","Data":"bd8bf613c3882029037ef19bcef567e108fed1e455b04aec2eba34a5d42b2093"} Oct 10 07:11:10 crc kubenswrapper[4822]: I1010 07:11:10.299093 4822 generic.go:334] "Generic (PLEG): container finished" podID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerID="248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a" exitCode=0 Oct 10 07:11:10 crc kubenswrapper[4822]: I1010 07:11:10.299168 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerDied","Data":"248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a"} Oct 10 07:11:10 crc kubenswrapper[4822]: I1010 07:11:10.305141 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerID="3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32" exitCode=0 Oct 10 07:11:10 crc kubenswrapper[4822]: I1010 
07:11:10.305182 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerDied","Data":"3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32"} Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.266571 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.268369 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.285990 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.328047 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerStarted","Data":"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc"} Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.346159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48f8\" (UniqueName: \"kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.346293 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc 
kubenswrapper[4822]: I1010 07:11:13.346535 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.357014 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bp469" podStartSLOduration=3.488367165 podStartE2EDuration="7.356993378s" podCreationTimestamp="2025-10-10 07:11:06 +0000 UTC" firstStartedPulling="2025-10-10 07:11:08.284139533 +0000 UTC m=+2815.379297729" lastFinishedPulling="2025-10-10 07:11:12.152765746 +0000 UTC m=+2819.247923942" observedRunningTime="2025-10-10 07:11:13.356144123 +0000 UTC m=+2820.451302329" watchObservedRunningTime="2025-10-10 07:11:13.356993378 +0000 UTC m=+2820.452151574" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.447757 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.447843 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.447891 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48f8\" (UniqueName: 
\"kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.448495 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.448604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.469793 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48f8\" (UniqueName: \"kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8\") pod \"redhat-marketplace-txfkb\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:13 crc kubenswrapper[4822]: I1010 07:11:13.583816 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 07:11:14.107369 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:14 crc kubenswrapper[4822]: W1010 07:11:14.115562 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201602ae_8b16_4f7a_96a0_a01ce3790ed2.slice/crio-1ee25f47a71ebb330bb7954a43afc8f3a50c2835ac07e6792b84c45be294b7b9 WatchSource:0}: Error finding container 1ee25f47a71ebb330bb7954a43afc8f3a50c2835ac07e6792b84c45be294b7b9: Status 404 returned error can't find the container with id 1ee25f47a71ebb330bb7954a43afc8f3a50c2835ac07e6792b84c45be294b7b9 Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 07:11:14.339324 4822 generic.go:334] "Generic (PLEG): container finished" podID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerID="ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348" exitCode=0 Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 07:11:14.339389 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerDied","Data":"ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348"} Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 07:11:14.345000 4822 generic.go:334] "Generic (PLEG): container finished" podID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerID="613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a" exitCode=0 Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 07:11:14.345988 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerDied","Data":"613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a"} Oct 10 07:11:14 crc kubenswrapper[4822]: I1010 
07:11:14.346034 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerStarted","Data":"1ee25f47a71ebb330bb7954a43afc8f3a50c2835ac07e6792b84c45be294b7b9"} Oct 10 07:11:15 crc kubenswrapper[4822]: I1010 07:11:15.354696 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerStarted","Data":"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3"} Oct 10 07:11:15 crc kubenswrapper[4822]: I1010 07:11:15.375032 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tk96" podStartSLOduration=2.8349236209999997 podStartE2EDuration="7.375015091s" podCreationTimestamp="2025-10-10 07:11:08 +0000 UTC" firstStartedPulling="2025-10-10 07:11:10.301327353 +0000 UTC m=+2817.396485559" lastFinishedPulling="2025-10-10 07:11:14.841418833 +0000 UTC m=+2821.936577029" observedRunningTime="2025-10-10 07:11:15.374711142 +0000 UTC m=+2822.469869348" watchObservedRunningTime="2025-10-10 07:11:15.375015091 +0000 UTC m=+2822.470173287" Oct 10 07:11:16 crc kubenswrapper[4822]: I1010 07:11:16.364939 4822 generic.go:334] "Generic (PLEG): container finished" podID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerID="103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef" exitCode=0 Oct 10 07:11:16 crc kubenswrapper[4822]: I1010 07:11:16.365076 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerDied","Data":"103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef"} Oct 10 07:11:17 crc kubenswrapper[4822]: I1010 07:11:17.186282 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:17 crc kubenswrapper[4822]: I1010 07:11:17.186658 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:17 crc kubenswrapper[4822]: I1010 07:11:17.374302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerStarted","Data":"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b"} Oct 10 07:11:18 crc kubenswrapper[4822]: I1010 07:11:18.240167 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bp469" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="registry-server" probeResult="failure" output=< Oct 10 07:11:18 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 07:11:18 crc kubenswrapper[4822]: > Oct 10 07:11:18 crc kubenswrapper[4822]: I1010 07:11:18.816310 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:18 crc kubenswrapper[4822]: I1010 07:11:18.816778 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:18 crc kubenswrapper[4822]: I1010 07:11:18.855972 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:18 crc kubenswrapper[4822]: I1010 07:11:18.884710 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txfkb" podStartSLOduration=3.3842495599999998 podStartE2EDuration="5.88468967s" podCreationTimestamp="2025-10-10 07:11:13 +0000 UTC" firstStartedPulling="2025-10-10 07:11:14.346663519 +0000 UTC m=+2821.441821715" lastFinishedPulling="2025-10-10 07:11:16.847103629 +0000 UTC 
m=+2823.942261825" observedRunningTime="2025-10-10 07:11:17.397087433 +0000 UTC m=+2824.492245649" watchObservedRunningTime="2025-10-10 07:11:18.88468967 +0000 UTC m=+2825.979847876" Oct 10 07:11:20 crc kubenswrapper[4822]: I1010 07:11:20.441008 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:21 crc kubenswrapper[4822]: I1010 07:11:21.064049 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.410325 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8tk96" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="registry-server" containerID="cri-o://abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3" gracePeriod=2 Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.830837 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.895979 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content\") pod \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.896055 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities\") pod \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.896182 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q79kp\" (UniqueName: \"kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp\") pod \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\" (UID: \"c50e2f47-65d7-4a2e-a94c-245993f5d66e\") " Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.897590 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities" (OuterVolumeSpecName: "utilities") pod "c50e2f47-65d7-4a2e-a94c-245993f5d66e" (UID: "c50e2f47-65d7-4a2e-a94c-245993f5d66e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.903276 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp" (OuterVolumeSpecName: "kube-api-access-q79kp") pod "c50e2f47-65d7-4a2e-a94c-245993f5d66e" (UID: "c50e2f47-65d7-4a2e-a94c-245993f5d66e"). InnerVolumeSpecName "kube-api-access-q79kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.947585 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c50e2f47-65d7-4a2e-a94c-245993f5d66e" (UID: "c50e2f47-65d7-4a2e-a94c-245993f5d66e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.998209 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.998264 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c50e2f47-65d7-4a2e-a94c-245993f5d66e-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:22 crc kubenswrapper[4822]: I1010 07:11:22.998284 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q79kp\" (UniqueName: \"kubernetes.io/projected/c50e2f47-65d7-4a2e-a94c-245993f5d66e-kube-api-access-q79kp\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.419286 4822 generic.go:334] "Generic (PLEG): container finished" podID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerID="abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3" exitCode=0 Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.419338 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerDied","Data":"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3"} Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.419354 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tk96" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.419379 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tk96" event={"ID":"c50e2f47-65d7-4a2e-a94c-245993f5d66e","Type":"ContainerDied","Data":"bd8bf613c3882029037ef19bcef567e108fed1e455b04aec2eba34a5d42b2093"} Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.419405 4822 scope.go:117] "RemoveContainer" containerID="abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.438040 4822 scope.go:117] "RemoveContainer" containerID="ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.466002 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.474323 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tk96"] Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.478495 4822 scope.go:117] "RemoveContainer" containerID="248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.494320 4822 scope.go:117] "RemoveContainer" containerID="abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3" Oct 10 07:11:23 crc kubenswrapper[4822]: E1010 07:11:23.494651 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3\": container with ID starting with abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3 not found: ID does not exist" containerID="abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.494685 
4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3"} err="failed to get container status \"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3\": rpc error: code = NotFound desc = could not find container \"abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3\": container with ID starting with abfc1175db5949cc537e64c16c80b540c2193659271f4cef8ee4a42a3fe94be3 not found: ID does not exist" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.494707 4822 scope.go:117] "RemoveContainer" containerID="ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348" Oct 10 07:11:23 crc kubenswrapper[4822]: E1010 07:11:23.494919 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348\": container with ID starting with ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348 not found: ID does not exist" containerID="ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.494944 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348"} err="failed to get container status \"ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348\": rpc error: code = NotFound desc = could not find container \"ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348\": container with ID starting with ac3254b10d3787e2d660836bc4aab1544d627d44759d40a7e266a292d2305348 not found: ID does not exist" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.494958 4822 scope.go:117] "RemoveContainer" containerID="248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a" Oct 10 07:11:23 crc kubenswrapper[4822]: E1010 
07:11:23.495116 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a\": container with ID starting with 248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a not found: ID does not exist" containerID="248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.495136 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a"} err="failed to get container status \"248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a\": rpc error: code = NotFound desc = could not find container \"248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a\": container with ID starting with 248a134beb12f4880d29449c64d9469cd0832cfc426b537bc185c1c22493853a not found: ID does not exist" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.583935 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.583984 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.626225 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:23 crc kubenswrapper[4822]: I1010 07:11:23.676330 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" path="/var/lib/kubelet/pods/c50e2f47-65d7-4a2e-a94c-245993f5d66e/volumes" Oct 10 07:11:24 crc kubenswrapper[4822]: I1010 07:11:24.482236 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:25 crc kubenswrapper[4822]: I1010 07:11:25.859278 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.445606 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txfkb" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="registry-server" containerID="cri-o://69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b" gracePeriod=2 Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.860577 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.958828 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities\") pod \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.958929 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content\") pod \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.958989 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g48f8\" (UniqueName: \"kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8\") pod \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\" (UID: \"201602ae-8b16-4f7a-96a0-a01ce3790ed2\") " Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.959700 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities" (OuterVolumeSpecName: "utilities") pod "201602ae-8b16-4f7a-96a0-a01ce3790ed2" (UID: "201602ae-8b16-4f7a-96a0-a01ce3790ed2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.968517 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8" (OuterVolumeSpecName: "kube-api-access-g48f8") pod "201602ae-8b16-4f7a-96a0-a01ce3790ed2" (UID: "201602ae-8b16-4f7a-96a0-a01ce3790ed2"). InnerVolumeSpecName "kube-api-access-g48f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:26 crc kubenswrapper[4822]: I1010 07:11:26.973466 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "201602ae-8b16-4f7a-96a0-a01ce3790ed2" (UID: "201602ae-8b16-4f7a-96a0-a01ce3790ed2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.060695 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.060741 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201602ae-8b16-4f7a-96a0-a01ce3790ed2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.060753 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g48f8\" (UniqueName: \"kubernetes.io/projected/201602ae-8b16-4f7a-96a0-a01ce3790ed2-kube-api-access-g48f8\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.226688 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.272919 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.455736 4822 generic.go:334] "Generic (PLEG): container finished" podID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerID="69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b" exitCode=0 Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.455967 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txfkb" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.456240 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerDied","Data":"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b"} Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.456477 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txfkb" event={"ID":"201602ae-8b16-4f7a-96a0-a01ce3790ed2","Type":"ContainerDied","Data":"1ee25f47a71ebb330bb7954a43afc8f3a50c2835ac07e6792b84c45be294b7b9"} Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.456635 4822 scope.go:117] "RemoveContainer" containerID="69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.488712 4822 scope.go:117] "RemoveContainer" containerID="103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.507890 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.523851 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txfkb"] Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.529163 4822 scope.go:117] "RemoveContainer" containerID="613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.552386 4822 scope.go:117] "RemoveContainer" containerID="69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b" Oct 10 07:11:27 crc kubenswrapper[4822]: E1010 07:11:27.552987 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b\": container with ID starting with 69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b not found: ID does not exist" containerID="69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.553118 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b"} err="failed to get container status \"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b\": rpc error: code = NotFound desc = could not find container \"69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b\": container with ID starting with 69f00394c3d091161c8d8120d594a90a5bf97413ff6413a45b7931be5d64284b not found: ID does not exist" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.553214 4822 scope.go:117] "RemoveContainer" containerID="103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef" Oct 10 07:11:27 crc kubenswrapper[4822]: E1010 07:11:27.553900 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef\": container with ID starting with 103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef not found: ID does not exist" containerID="103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.553939 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef"} err="failed to get container status \"103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef\": rpc error: code = NotFound desc = could not find container \"103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef\": container with ID 
starting with 103dd2d1bf1871966b89e56b524ec7a4bd4a1a459ba028bedb3123094d8592ef not found: ID does not exist" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.553962 4822 scope.go:117] "RemoveContainer" containerID="613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a" Oct 10 07:11:27 crc kubenswrapper[4822]: E1010 07:11:27.554223 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a\": container with ID starting with 613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a not found: ID does not exist" containerID="613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.554256 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a"} err="failed to get container status \"613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a\": rpc error: code = NotFound desc = could not find container \"613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a\": container with ID starting with 613b5ea2d2faa415c37e4def5ec62a04e5fa8c1c48b6ef720be86131d90ea38a not found: ID does not exist" Oct 10 07:11:27 crc kubenswrapper[4822]: I1010 07:11:27.662553 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" path="/var/lib/kubelet/pods/201602ae-8b16-4f7a-96a0-a01ce3790ed2/volumes" Oct 10 07:11:29 crc kubenswrapper[4822]: I1010 07:11:29.455612 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:29 crc kubenswrapper[4822]: I1010 07:11:29.456206 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bp469" 
podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="registry-server" containerID="cri-o://d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc" gracePeriod=2 Oct 10 07:11:29 crc kubenswrapper[4822]: I1010 07:11:29.891182 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.009406 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2bn\" (UniqueName: \"kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn\") pod \"8a95beb7-72f9-4c83-a1a4-8615312a8940\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.009553 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities\") pod \"8a95beb7-72f9-4c83-a1a4-8615312a8940\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.009607 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content\") pod \"8a95beb7-72f9-4c83-a1a4-8615312a8940\" (UID: \"8a95beb7-72f9-4c83-a1a4-8615312a8940\") " Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.010817 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities" (OuterVolumeSpecName: "utilities") pod "8a95beb7-72f9-4c83-a1a4-8615312a8940" (UID: "8a95beb7-72f9-4c83-a1a4-8615312a8940"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.015595 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn" (OuterVolumeSpecName: "kube-api-access-4q2bn") pod "8a95beb7-72f9-4c83-a1a4-8615312a8940" (UID: "8a95beb7-72f9-4c83-a1a4-8615312a8940"). InnerVolumeSpecName "kube-api-access-4q2bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.110633 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.110664 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2bn\" (UniqueName: \"kubernetes.io/projected/8a95beb7-72f9-4c83-a1a4-8615312a8940-kube-api-access-4q2bn\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.110909 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a95beb7-72f9-4c83-a1a4-8615312a8940" (UID: "8a95beb7-72f9-4c83-a1a4-8615312a8940"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.212184 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a95beb7-72f9-4c83-a1a4-8615312a8940-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.483399 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerID="d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc" exitCode=0 Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.483452 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerDied","Data":"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc"} Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.483508 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bp469" event={"ID":"8a95beb7-72f9-4c83-a1a4-8615312a8940","Type":"ContainerDied","Data":"72fde1b41cdb3f9894096ed9920538ff0b5c4bdc1a2495cecd7f3681551bce09"} Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.483510 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bp469" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.483539 4822 scope.go:117] "RemoveContainer" containerID="d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.527934 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.532761 4822 scope.go:117] "RemoveContainer" containerID="3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.534293 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bp469"] Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.555357 4822 scope.go:117] "RemoveContainer" containerID="0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.582941 4822 scope.go:117] "RemoveContainer" containerID="d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc" Oct 10 07:11:30 crc kubenswrapper[4822]: E1010 07:11:30.583481 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc\": container with ID starting with d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc not found: ID does not exist" containerID="d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.583605 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc"} err="failed to get container status \"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc\": rpc error: code = NotFound desc = could not find container 
\"d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc\": container with ID starting with d1ea8842ffb1bc1fc47cb0c375f18b3bcdecb78de06526b0e8de93a75f05f1bc not found: ID does not exist" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.583686 4822 scope.go:117] "RemoveContainer" containerID="3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32" Oct 10 07:11:30 crc kubenswrapper[4822]: E1010 07:11:30.584117 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32\": container with ID starting with 3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32 not found: ID does not exist" containerID="3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.584264 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32"} err="failed to get container status \"3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32\": rpc error: code = NotFound desc = could not find container \"3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32\": container with ID starting with 3cc5b6412cd927417787c24048230fdb5be5c7c51db9a57d9b4c0c2065ac4b32 not found: ID does not exist" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.584407 4822 scope.go:117] "RemoveContainer" containerID="0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87" Oct 10 07:11:30 crc kubenswrapper[4822]: E1010 07:11:30.584708 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87\": container with ID starting with 0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87 not found: ID does not exist" 
containerID="0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87" Oct 10 07:11:30 crc kubenswrapper[4822]: I1010 07:11:30.584746 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87"} err="failed to get container status \"0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87\": rpc error: code = NotFound desc = could not find container \"0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87\": container with ID starting with 0161ce95d90cdeac3c1f36a23362e887db27b616081ae8cf4caff6304c32cf87 not found: ID does not exist" Oct 10 07:11:31 crc kubenswrapper[4822]: I1010 07:11:31.662568 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" path="/var/lib/kubelet/pods/8a95beb7-72f9-4c83-a1a4-8615312a8940/volumes" Oct 10 07:12:01 crc kubenswrapper[4822]: I1010 07:12:01.337262 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:12:01 crc kubenswrapper[4822]: I1010 07:12:01.337875 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:12:31 crc kubenswrapper[4822]: I1010 07:12:31.337114 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 10 07:12:31 crc kubenswrapper[4822]: I1010 07:12:31.338010 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:13:01 crc kubenswrapper[4822]: I1010 07:13:01.337296 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:13:01 crc kubenswrapper[4822]: I1010 07:13:01.337987 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:13:01 crc kubenswrapper[4822]: I1010 07:13:01.338096 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:13:01 crc kubenswrapper[4822]: I1010 07:13:01.338988 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:13:01 crc kubenswrapper[4822]: I1010 07:13:01.339045 4822 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" gracePeriod=600 Oct 10 07:13:01 crc kubenswrapper[4822]: E1010 07:13:01.462297 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:13:02 crc kubenswrapper[4822]: I1010 07:13:02.217344 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" exitCode=0 Oct 10 07:13:02 crc kubenswrapper[4822]: I1010 07:13:02.217392 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c"} Oct 10 07:13:02 crc kubenswrapper[4822]: I1010 07:13:02.217428 4822 scope.go:117] "RemoveContainer" containerID="7e2bd9dbbe06c39b597a6734c6299d41102040c49b75b4afb9dc3ad5c1a4bfad" Oct 10 07:13:02 crc kubenswrapper[4822]: I1010 07:13:02.218498 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:13:02 crc kubenswrapper[4822]: E1010 07:13:02.219024 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:13:14 crc kubenswrapper[4822]: I1010 07:13:14.650504 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:13:14 crc kubenswrapper[4822]: E1010 07:13:14.651329 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:13:25 crc kubenswrapper[4822]: I1010 07:13:25.650962 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:13:25 crc kubenswrapper[4822]: E1010 07:13:25.653263 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:13:39 crc kubenswrapper[4822]: I1010 07:13:39.651098 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:13:39 crc kubenswrapper[4822]: E1010 07:13:39.652165 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:13:50 crc kubenswrapper[4822]: I1010 07:13:50.650387 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:13:50 crc kubenswrapper[4822]: E1010 07:13:50.651902 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:14:04 crc kubenswrapper[4822]: I1010 07:14:04.650528 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:14:04 crc kubenswrapper[4822]: E1010 07:14:04.651511 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:14:17 crc kubenswrapper[4822]: I1010 07:14:17.650099 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:14:17 crc kubenswrapper[4822]: E1010 07:14:17.650733 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:14:32 crc kubenswrapper[4822]: I1010 07:14:32.650548 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:14:32 crc kubenswrapper[4822]: E1010 07:14:32.651252 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:14:43 crc kubenswrapper[4822]: I1010 07:14:43.656400 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:14:43 crc kubenswrapper[4822]: E1010 07:14:43.657489 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:14:56 crc kubenswrapper[4822]: I1010 07:14:56.651390 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:14:56 crc kubenswrapper[4822]: E1010 07:14:56.652691 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.197982 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr"] Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198694 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198712 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198730 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198737 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198754 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198762 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198774 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198781 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198797 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198822 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198836 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198842 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198857 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198863 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198871 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198877 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: E1010 07:15:00.198892 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.198898 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.199045 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c50e2f47-65d7-4a2e-a94c-245993f5d66e" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.199057 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a95beb7-72f9-4c83-a1a4-8615312a8940" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.199075 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="201602ae-8b16-4f7a-96a0-a01ce3790ed2" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.199589 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.201678 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.201866 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.214059 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr"] Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.259960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gx2c\" (UniqueName: \"kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 
07:15:00.260009 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.260284 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.361478 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.361554 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gx2c\" (UniqueName: \"kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.361589 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: 
\"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.362639 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.370537 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.378702 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gx2c\" (UniqueName: \"kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c\") pod \"collect-profiles-29334675-78jhr\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.521733 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:00 crc kubenswrapper[4822]: I1010 07:15:00.939079 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr"] Oct 10 07:15:01 crc kubenswrapper[4822]: I1010 07:15:01.180265 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" event={"ID":"e9dc9779-cb71-4877-9f3a-88ab1a805bb2","Type":"ContainerStarted","Data":"1822d51afeabbd58543b9b625375c30e8d2f940b6ba6af8d2b69373209e3a4dc"} Oct 10 07:15:01 crc kubenswrapper[4822]: I1010 07:15:01.180633 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" event={"ID":"e9dc9779-cb71-4877-9f3a-88ab1a805bb2","Type":"ContainerStarted","Data":"52049a05fadb2a3a131117453ac3946992f794124315b1af3c86bfc7780d27a3"} Oct 10 07:15:01 crc kubenswrapper[4822]: I1010 07:15:01.207161 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" podStartSLOduration=1.207142518 podStartE2EDuration="1.207142518s" podCreationTimestamp="2025-10-10 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:15:01.204718968 +0000 UTC m=+3048.299877164" watchObservedRunningTime="2025-10-10 07:15:01.207142518 +0000 UTC m=+3048.302300714" Oct 10 07:15:02 crc kubenswrapper[4822]: I1010 07:15:02.193309 4822 generic.go:334] "Generic (PLEG): container finished" podID="e9dc9779-cb71-4877-9f3a-88ab1a805bb2" containerID="1822d51afeabbd58543b9b625375c30e8d2f940b6ba6af8d2b69373209e3a4dc" exitCode=0 Oct 10 07:15:02 crc kubenswrapper[4822]: I1010 07:15:02.193380 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" event={"ID":"e9dc9779-cb71-4877-9f3a-88ab1a805bb2","Type":"ContainerDied","Data":"1822d51afeabbd58543b9b625375c30e8d2f940b6ba6af8d2b69373209e3a4dc"} Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.481631 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.609830 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume\") pod \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.609892 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gx2c\" (UniqueName: \"kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c\") pod \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.610017 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume\") pod \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\" (UID: \"e9dc9779-cb71-4877-9f3a-88ab1a805bb2\") " Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.610673 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9dc9779-cb71-4877-9f3a-88ab1a805bb2" (UID: "e9dc9779-cb71-4877-9f3a-88ab1a805bb2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.615701 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c" (OuterVolumeSpecName: "kube-api-access-9gx2c") pod "e9dc9779-cb71-4877-9f3a-88ab1a805bb2" (UID: "e9dc9779-cb71-4877-9f3a-88ab1a805bb2"). InnerVolumeSpecName "kube-api-access-9gx2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.616304 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e9dc9779-cb71-4877-9f3a-88ab1a805bb2" (UID: "e9dc9779-cb71-4877-9f3a-88ab1a805bb2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.711952 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.711989 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:03 crc kubenswrapper[4822]: I1010 07:15:03.711999 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gx2c\" (UniqueName: \"kubernetes.io/projected/e9dc9779-cb71-4877-9f3a-88ab1a805bb2-kube-api-access-9gx2c\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:04 crc kubenswrapper[4822]: I1010 07:15:04.216112 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" 
event={"ID":"e9dc9779-cb71-4877-9f3a-88ab1a805bb2","Type":"ContainerDied","Data":"52049a05fadb2a3a131117453ac3946992f794124315b1af3c86bfc7780d27a3"} Oct 10 07:15:04 crc kubenswrapper[4822]: I1010 07:15:04.216158 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52049a05fadb2a3a131117453ac3946992f794124315b1af3c86bfc7780d27a3" Oct 10 07:15:04 crc kubenswrapper[4822]: I1010 07:15:04.216129 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr" Oct 10 07:15:04 crc kubenswrapper[4822]: I1010 07:15:04.283710 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk"] Oct 10 07:15:04 crc kubenswrapper[4822]: I1010 07:15:04.289664 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334630-zclqk"] Oct 10 07:15:05 crc kubenswrapper[4822]: I1010 07:15:05.661872 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70328244-19ca-4109-91cf-092435c7485c" path="/var/lib/kubelet/pods/70328244-19ca-4109-91cf-092435c7485c/volumes" Oct 10 07:15:09 crc kubenswrapper[4822]: I1010 07:15:09.650889 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:15:09 crc kubenswrapper[4822]: E1010 07:15:09.652023 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:15:21 crc kubenswrapper[4822]: I1010 07:15:21.742640 4822 scope.go:117] "RemoveContainer" 
containerID="a229f0517e627499c510ad3b7f4c030846600b19d5301ea3c6a7e4ea5ca71156" Oct 10 07:15:23 crc kubenswrapper[4822]: I1010 07:15:23.664516 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:15:23 crc kubenswrapper[4822]: E1010 07:15:23.666262 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:15:34 crc kubenswrapper[4822]: I1010 07:15:34.649917 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:15:34 crc kubenswrapper[4822]: E1010 07:15:34.650613 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:15:46 crc kubenswrapper[4822]: I1010 07:15:46.650488 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:15:46 crc kubenswrapper[4822]: E1010 07:15:46.651309 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:15:59 crc kubenswrapper[4822]: I1010 07:15:59.650975 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:15:59 crc kubenswrapper[4822]: E1010 07:15:59.651843 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:16:14 crc kubenswrapper[4822]: I1010 07:16:14.650386 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:16:14 crc kubenswrapper[4822]: E1010 07:16:14.651304 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:16:27 crc kubenswrapper[4822]: I1010 07:16:27.652613 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:16:27 crc kubenswrapper[4822]: E1010 07:16:27.653381 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:16:39 crc kubenswrapper[4822]: I1010 07:16:39.651078 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:16:39 crc kubenswrapper[4822]: E1010 07:16:39.653207 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:16:53 crc kubenswrapper[4822]: I1010 07:16:53.658408 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:16:53 crc kubenswrapper[4822]: E1010 07:16:53.659914 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:17:04 crc kubenswrapper[4822]: I1010 07:17:04.651242 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:17:04 crc kubenswrapper[4822]: E1010 07:17:04.652134 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:17:18 crc kubenswrapper[4822]: I1010 07:17:18.650570 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:17:18 crc kubenswrapper[4822]: E1010 07:17:18.651442 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:17:33 crc kubenswrapper[4822]: I1010 07:17:33.655287 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:17:33 crc kubenswrapper[4822]: E1010 07:17:33.656112 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:17:45 crc kubenswrapper[4822]: I1010 07:17:45.650500 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:17:45 crc kubenswrapper[4822]: E1010 07:17:45.652541 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:17:59 crc kubenswrapper[4822]: I1010 07:17:59.650951 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:17:59 crc kubenswrapper[4822]: E1010 07:17:59.652609 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:18:11 crc kubenswrapper[4822]: I1010 07:18:11.650189 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:18:12 crc kubenswrapper[4822]: I1010 07:18:12.742419 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460"} Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.819153 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:03 crc kubenswrapper[4822]: E1010 07:20:03.820135 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9dc9779-cb71-4877-9f3a-88ab1a805bb2" containerName="collect-profiles" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.820154 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9dc9779-cb71-4877-9f3a-88ab1a805bb2" 
containerName="collect-profiles" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.820498 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9dc9779-cb71-4877-9f3a-88ab1a805bb2" containerName="collect-profiles" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.822754 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.848007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.949287 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.949475 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvv5\" (UniqueName: \"kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:03 crc kubenswrapper[4822]: I1010 07:20:03.949540 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.051306 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fzvv5\" (UniqueName: \"kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.051428 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.051544 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.052476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.052523 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.073399 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvv5\" (UniqueName: 
\"kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5\") pod \"community-operators-h2wmx\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.147506 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.659322 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:04 crc kubenswrapper[4822]: I1010 07:20:04.702715 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerStarted","Data":"b826b3cfa1bdc0028ff9727d937e23d8547e7ad8d595385e645a34fb25e25f03"} Oct 10 07:20:05 crc kubenswrapper[4822]: I1010 07:20:05.711425 4822 generic.go:334] "Generic (PLEG): container finished" podID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerID="b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9" exitCode=0 Oct 10 07:20:05 crc kubenswrapper[4822]: I1010 07:20:05.711530 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerDied","Data":"b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9"} Oct 10 07:20:05 crc kubenswrapper[4822]: I1010 07:20:05.714460 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:20:06 crc kubenswrapper[4822]: I1010 07:20:06.721605 4822 generic.go:334] "Generic (PLEG): container finished" podID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerID="4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f" exitCode=0 Oct 10 07:20:06 crc kubenswrapper[4822]: I1010 07:20:06.721669 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerDied","Data":"4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f"} Oct 10 07:20:07 crc kubenswrapper[4822]: I1010 07:20:07.733737 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerStarted","Data":"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d"} Oct 10 07:20:07 crc kubenswrapper[4822]: I1010 07:20:07.760447 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h2wmx" podStartSLOduration=3.197042119 podStartE2EDuration="4.760409032s" podCreationTimestamp="2025-10-10 07:20:03 +0000 UTC" firstStartedPulling="2025-10-10 07:20:05.714251299 +0000 UTC m=+3352.809409495" lastFinishedPulling="2025-10-10 07:20:07.277618172 +0000 UTC m=+3354.372776408" observedRunningTime="2025-10-10 07:20:07.747303802 +0000 UTC m=+3354.842462058" watchObservedRunningTime="2025-10-10 07:20:07.760409032 +0000 UTC m=+3354.855567268" Oct 10 07:20:14 crc kubenswrapper[4822]: I1010 07:20:14.148609 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:14 crc kubenswrapper[4822]: I1010 07:20:14.150140 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:14 crc kubenswrapper[4822]: I1010 07:20:14.196699 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:14 crc kubenswrapper[4822]: I1010 07:20:14.833165 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:14 crc kubenswrapper[4822]: 
I1010 07:20:14.879168 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:16 crc kubenswrapper[4822]: I1010 07:20:16.810681 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h2wmx" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="registry-server" containerID="cri-o://a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d" gracePeriod=2 Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.341834 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.445510 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities\") pod \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.445685 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzvv5\" (UniqueName: \"kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5\") pod \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.445835 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content\") pod \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\" (UID: \"8cc61585-f210-4a8c-b8aa-b686f3d533eb\") " Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.446899 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities" 
(OuterVolumeSpecName: "utilities") pod "8cc61585-f210-4a8c-b8aa-b686f3d533eb" (UID: "8cc61585-f210-4a8c-b8aa-b686f3d533eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.452288 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5" (OuterVolumeSpecName: "kube-api-access-fzvv5") pod "8cc61585-f210-4a8c-b8aa-b686f3d533eb" (UID: "8cc61585-f210-4a8c-b8aa-b686f3d533eb"). InnerVolumeSpecName "kube-api-access-fzvv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.548121 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.548182 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzvv5\" (UniqueName: \"kubernetes.io/projected/8cc61585-f210-4a8c-b8aa-b686f3d533eb-kube-api-access-fzvv5\") on node \"crc\" DevicePath \"\"" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.776346 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cc61585-f210-4a8c-b8aa-b686f3d533eb" (UID: "8cc61585-f210-4a8c-b8aa-b686f3d533eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.820001 4822 generic.go:334] "Generic (PLEG): container finished" podID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerID="a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d" exitCode=0 Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.820051 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerDied","Data":"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d"} Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.820082 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h2wmx" event={"ID":"8cc61585-f210-4a8c-b8aa-b686f3d533eb","Type":"ContainerDied","Data":"b826b3cfa1bdc0028ff9727d937e23d8547e7ad8d595385e645a34fb25e25f03"} Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.820094 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h2wmx" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.820104 4822 scope.go:117] "RemoveContainer" containerID="a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.853937 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc61585-f210-4a8c-b8aa-b686f3d533eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.861206 4822 scope.go:117] "RemoveContainer" containerID="4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.869131 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.880245 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h2wmx"] Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.885961 4822 scope.go:117] "RemoveContainer" containerID="b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.920412 4822 scope.go:117] "RemoveContainer" containerID="a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d" Oct 10 07:20:17 crc kubenswrapper[4822]: E1010 07:20:17.921634 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d\": container with ID starting with a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d not found: ID does not exist" containerID="a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.921691 4822 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d"} err="failed to get container status \"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d\": rpc error: code = NotFound desc = could not find container \"a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d\": container with ID starting with a2fcb27db4b9c2d4625ce63432a1bbfd48515086d332296a128d29460b8b6c9d not found: ID does not exist" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.921737 4822 scope.go:117] "RemoveContainer" containerID="4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f" Oct 10 07:20:17 crc kubenswrapper[4822]: E1010 07:20:17.922197 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f\": container with ID starting with 4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f not found: ID does not exist" containerID="4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.922225 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f"} err="failed to get container status \"4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f\": rpc error: code = NotFound desc = could not find container \"4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f\": container with ID starting with 4a24eaef72b16061872f0632e66483e3456459555d1deafa0c112acd6dd9d56f not found: ID does not exist" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.922249 4822 scope.go:117] "RemoveContainer" containerID="b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9" Oct 10 07:20:17 crc kubenswrapper[4822]: E1010 07:20:17.923186 4822 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9\": container with ID starting with b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9 not found: ID does not exist" containerID="b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9" Oct 10 07:20:17 crc kubenswrapper[4822]: I1010 07:20:17.923215 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9"} err="failed to get container status \"b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9\": rpc error: code = NotFound desc = could not find container \"b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9\": container with ID starting with b3aefe03ca92677e80b2a89871a92778dae2f47611eb3c4e76c3d63b64b772a9 not found: ID does not exist" Oct 10 07:20:19 crc kubenswrapper[4822]: I1010 07:20:19.677180 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" path="/var/lib/kubelet/pods/8cc61585-f210-4a8c-b8aa-b686f3d533eb/volumes" Oct 10 07:20:31 crc kubenswrapper[4822]: I1010 07:20:31.336500 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:20:31 crc kubenswrapper[4822]: I1010 07:20:31.337282 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:21:01 crc kubenswrapper[4822]: I1010 
07:21:01.337185 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:21:01 crc kubenswrapper[4822]: I1010 07:21:01.338072 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:21:31 crc kubenswrapper[4822]: I1010 07:21:31.337283 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:21:31 crc kubenswrapper[4822]: I1010 07:21:31.337960 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:21:31 crc kubenswrapper[4822]: I1010 07:21:31.338024 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:21:31 crc kubenswrapper[4822]: I1010 07:21:31.338874 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:21:31 crc kubenswrapper[4822]: I1010 07:21:31.338973 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460" gracePeriod=600 Oct 10 07:21:32 crc kubenswrapper[4822]: I1010 07:21:32.469571 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460" exitCode=0 Oct 10 07:21:32 crc kubenswrapper[4822]: I1010 07:21:32.469669 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460"} Oct 10 07:21:32 crc kubenswrapper[4822]: I1010 07:21:32.470029 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c"} Oct 10 07:21:32 crc kubenswrapper[4822]: I1010 07:21:32.470067 4822 scope.go:117] "RemoveContainer" containerID="5835cab246539ab7b186e404bce35dfaacad6447e521972d06cdb8627e598f4c" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.088287 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:37 crc kubenswrapper[4822]: E1010 07:21:37.089156 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="extract-content" Oct 10 07:21:37 crc 
kubenswrapper[4822]: I1010 07:21:37.089171 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="extract-content" Oct 10 07:21:37 crc kubenswrapper[4822]: E1010 07:21:37.089186 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="extract-utilities" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.089195 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="extract-utilities" Oct 10 07:21:37 crc kubenswrapper[4822]: E1010 07:21:37.089207 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="registry-server" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.089213 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="registry-server" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.089343 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc61585-f210-4a8c-b8aa-b686f3d533eb" containerName="registry-server" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.090350 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.098887 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.136969 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhzl\" (UniqueName: \"kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.137061 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.137104 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.238987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhzl\" (UniqueName: \"kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.239066 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.239106 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.239669 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.240264 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.264315 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhzl\" (UniqueName: \"kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl\") pod \"redhat-marketplace-ws8p5\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.418677 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:37 crc kubenswrapper[4822]: I1010 07:21:37.850379 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:38 crc kubenswrapper[4822]: I1010 07:21:38.533255 4822 generic.go:334] "Generic (PLEG): container finished" podID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerID="babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78" exitCode=0 Oct 10 07:21:38 crc kubenswrapper[4822]: I1010 07:21:38.533309 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerDied","Data":"babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78"} Oct 10 07:21:38 crc kubenswrapper[4822]: I1010 07:21:38.533565 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerStarted","Data":"8b43787eef769073194ea21aab9368198ac2e3864e04ad4c1d41269593050eb9"} Oct 10 07:21:39 crc kubenswrapper[4822]: I1010 07:21:39.544256 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerStarted","Data":"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d"} Oct 10 07:21:40 crc kubenswrapper[4822]: I1010 07:21:40.554790 4822 generic.go:334] "Generic (PLEG): container finished" podID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerID="ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d" exitCode=0 Oct 10 07:21:40 crc kubenswrapper[4822]: I1010 07:21:40.554911 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" 
event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerDied","Data":"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d"} Oct 10 07:21:41 crc kubenswrapper[4822]: I1010 07:21:41.575610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerStarted","Data":"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25"} Oct 10 07:21:41 crc kubenswrapper[4822]: I1010 07:21:41.599762 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ws8p5" podStartSLOduration=2.090792751 podStartE2EDuration="4.59974701s" podCreationTimestamp="2025-10-10 07:21:37 +0000 UTC" firstStartedPulling="2025-10-10 07:21:38.535890693 +0000 UTC m=+3445.631048889" lastFinishedPulling="2025-10-10 07:21:41.044844942 +0000 UTC m=+3448.140003148" observedRunningTime="2025-10-10 07:21:41.59250989 +0000 UTC m=+3448.687668096" watchObservedRunningTime="2025-10-10 07:21:41.59974701 +0000 UTC m=+3448.694905206" Oct 10 07:21:47 crc kubenswrapper[4822]: I1010 07:21:47.419116 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:47 crc kubenswrapper[4822]: I1010 07:21:47.419776 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:47 crc kubenswrapper[4822]: I1010 07:21:47.500187 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:47 crc kubenswrapper[4822]: I1010 07:21:47.688115 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:47 crc kubenswrapper[4822]: I1010 07:21:47.747596 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:49 crc kubenswrapper[4822]: I1010 07:21:49.651070 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ws8p5" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="registry-server" containerID="cri-o://fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25" gracePeriod=2 Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.111859 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.236379 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqhzl\" (UniqueName: \"kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl\") pod \"157c39ab-78b8-4925-a5f4-2374d3817f99\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.236458 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities\") pod \"157c39ab-78b8-4925-a5f4-2374d3817f99\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.236626 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content\") pod \"157c39ab-78b8-4925-a5f4-2374d3817f99\" (UID: \"157c39ab-78b8-4925-a5f4-2374d3817f99\") " Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.237916 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities" (OuterVolumeSpecName: "utilities") pod "157c39ab-78b8-4925-a5f4-2374d3817f99" (UID: 
"157c39ab-78b8-4925-a5f4-2374d3817f99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.246477 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl" (OuterVolumeSpecName: "kube-api-access-gqhzl") pod "157c39ab-78b8-4925-a5f4-2374d3817f99" (UID: "157c39ab-78b8-4925-a5f4-2374d3817f99"). InnerVolumeSpecName "kube-api-access-gqhzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.250799 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "157c39ab-78b8-4925-a5f4-2374d3817f99" (UID: "157c39ab-78b8-4925-a5f4-2374d3817f99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.338826 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.338879 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqhzl\" (UniqueName: \"kubernetes.io/projected/157c39ab-78b8-4925-a5f4-2374d3817f99-kube-api-access-gqhzl\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.338902 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157c39ab-78b8-4925-a5f4-2374d3817f99-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.664898 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerID="fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25" exitCode=0 Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.664993 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ws8p5" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.664963 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerDied","Data":"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25"} Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.665192 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ws8p5" event={"ID":"157c39ab-78b8-4925-a5f4-2374d3817f99","Type":"ContainerDied","Data":"8b43787eef769073194ea21aab9368198ac2e3864e04ad4c1d41269593050eb9"} Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.665227 4822 scope.go:117] "RemoveContainer" containerID="fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.691409 4822 scope.go:117] "RemoveContainer" containerID="ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.702442 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.708623 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ws8p5"] Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.723321 4822 scope.go:117] "RemoveContainer" containerID="babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.743789 4822 scope.go:117] "RemoveContainer" 
containerID="fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25" Oct 10 07:21:50 crc kubenswrapper[4822]: E1010 07:21:50.744285 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25\": container with ID starting with fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25 not found: ID does not exist" containerID="fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.744430 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25"} err="failed to get container status \"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25\": rpc error: code = NotFound desc = could not find container \"fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25\": container with ID starting with fa274e333dd69d6f304bcd93043707161a199ed998ab555ec2964f1dc64dbb25 not found: ID does not exist" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.744533 4822 scope.go:117] "RemoveContainer" containerID="ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d" Oct 10 07:21:50 crc kubenswrapper[4822]: E1010 07:21:50.745363 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d\": container with ID starting with ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d not found: ID does not exist" containerID="ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.745414 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d"} err="failed to get container status \"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d\": rpc error: code = NotFound desc = could not find container \"ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d\": container with ID starting with ac852540ea0c6102c304b5018b64dbb0baa580b58ace476c8166870bb3f78b9d not found: ID does not exist" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.745447 4822 scope.go:117] "RemoveContainer" containerID="babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78" Oct 10 07:21:50 crc kubenswrapper[4822]: E1010 07:21:50.745733 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78\": container with ID starting with babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78 not found: ID does not exist" containerID="babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78" Oct 10 07:21:50 crc kubenswrapper[4822]: I1010 07:21:50.745871 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78"} err="failed to get container status \"babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78\": rpc error: code = NotFound desc = could not find container \"babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78\": container with ID starting with babdae4bd1b635abe2028aa0dde25c3cbc2ae8158fccbe3f56bc6b717083bd78 not found: ID does not exist" Oct 10 07:21:51 crc kubenswrapper[4822]: I1010 07:21:51.662510 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" path="/var/lib/kubelet/pods/157c39ab-78b8-4925-a5f4-2374d3817f99/volumes" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 
07:21:52.362800 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:21:52 crc kubenswrapper[4822]: E1010 07:21:52.363363 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="extract-content" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.363394 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="extract-content" Oct 10 07:21:52 crc kubenswrapper[4822]: E1010 07:21:52.364366 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="registry-server" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.364553 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="registry-server" Oct 10 07:21:52 crc kubenswrapper[4822]: E1010 07:21:52.364629 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="extract-utilities" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.364667 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="extract-utilities" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.368446 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="157c39ab-78b8-4925-a5f4-2374d3817f99" containerName="registry-server" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.375885 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.384567 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.568779 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.569437 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.569625 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plb4\" (UniqueName: \"kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.685263 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plb4\" (UniqueName: \"kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.685547 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.685775 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.686553 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.689380 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.709716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plb4\" (UniqueName: \"kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4\") pod \"redhat-operators-95z7x\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:52 crc kubenswrapper[4822]: I1010 07:21:52.717459 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:21:53 crc kubenswrapper[4822]: I1010 07:21:53.178354 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:21:53 crc kubenswrapper[4822]: I1010 07:21:53.695947 4822 generic.go:334] "Generic (PLEG): container finished" podID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerID="9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd" exitCode=0 Oct 10 07:21:53 crc kubenswrapper[4822]: I1010 07:21:53.696794 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerDied","Data":"9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd"} Oct 10 07:21:53 crc kubenswrapper[4822]: I1010 07:21:53.697003 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerStarted","Data":"c60db7c95884a1cf345a5e890854aa381ea629a906585b7b96082d0a5428712d"} Oct 10 07:21:54 crc kubenswrapper[4822]: I1010 07:21:54.709277 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerStarted","Data":"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7"} Oct 10 07:21:55 crc kubenswrapper[4822]: I1010 07:21:55.722769 4822 generic.go:334] "Generic (PLEG): container finished" podID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerID="a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7" exitCode=0 Oct 10 07:21:55 crc kubenswrapper[4822]: I1010 07:21:55.722895 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" 
event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerDied","Data":"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7"} Oct 10 07:21:56 crc kubenswrapper[4822]: I1010 07:21:56.737109 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerStarted","Data":"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310"} Oct 10 07:21:56 crc kubenswrapper[4822]: I1010 07:21:56.761139 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95z7x" podStartSLOduration=2.301760644 podStartE2EDuration="4.761108193s" podCreationTimestamp="2025-10-10 07:21:52 +0000 UTC" firstStartedPulling="2025-10-10 07:21:53.699006147 +0000 UTC m=+3460.794164363" lastFinishedPulling="2025-10-10 07:21:56.158353676 +0000 UTC m=+3463.253511912" observedRunningTime="2025-10-10 07:21:56.759458285 +0000 UTC m=+3463.854616511" watchObservedRunningTime="2025-10-10 07:21:56.761108193 +0000 UTC m=+3463.856266439" Oct 10 07:22:02 crc kubenswrapper[4822]: I1010 07:22:02.718408 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:02 crc kubenswrapper[4822]: I1010 07:22:02.718780 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:02 crc kubenswrapper[4822]: I1010 07:22:02.798637 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:02 crc kubenswrapper[4822]: I1010 07:22:02.866669 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:03 crc kubenswrapper[4822]: I1010 07:22:03.040963 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:22:04 crc kubenswrapper[4822]: I1010 07:22:04.810259 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95z7x" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="registry-server" containerID="cri-o://89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310" gracePeriod=2 Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.612541 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.796146 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities\") pod \"c5ad5ef3-5771-4170-93af-8e95d561efe7\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.796212 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5plb4\" (UniqueName: \"kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4\") pod \"c5ad5ef3-5771-4170-93af-8e95d561efe7\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.796348 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content\") pod \"c5ad5ef3-5771-4170-93af-8e95d561efe7\" (UID: \"c5ad5ef3-5771-4170-93af-8e95d561efe7\") " Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.797405 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities" (OuterVolumeSpecName: "utilities") pod "c5ad5ef3-5771-4170-93af-8e95d561efe7" (UID: 
"c5ad5ef3-5771-4170-93af-8e95d561efe7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.802098 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4" (OuterVolumeSpecName: "kube-api-access-5plb4") pod "c5ad5ef3-5771-4170-93af-8e95d561efe7" (UID: "c5ad5ef3-5771-4170-93af-8e95d561efe7"). InnerVolumeSpecName "kube-api-access-5plb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.831352 4822 generic.go:334] "Generic (PLEG): container finished" podID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerID="89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310" exitCode=0 Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.831398 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerDied","Data":"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310"} Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.831428 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95z7x" event={"ID":"c5ad5ef3-5771-4170-93af-8e95d561efe7","Type":"ContainerDied","Data":"c60db7c95884a1cf345a5e890854aa381ea629a906585b7b96082d0a5428712d"} Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.831446 4822 scope.go:117] "RemoveContainer" containerID="89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.831450 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95z7x" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.852641 4822 scope.go:117] "RemoveContainer" containerID="a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.870259 4822 scope.go:117] "RemoveContainer" containerID="9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.893664 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5ad5ef3-5771-4170-93af-8e95d561efe7" (UID: "c5ad5ef3-5771-4170-93af-8e95d561efe7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.898420 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.898479 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ad5ef3-5771-4170-93af-8e95d561efe7-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.898497 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5plb4\" (UniqueName: \"kubernetes.io/projected/c5ad5ef3-5771-4170-93af-8e95d561efe7-kube-api-access-5plb4\") on node \"crc\" DevicePath \"\"" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.900472 4822 scope.go:117] "RemoveContainer" containerID="89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310" Oct 10 07:22:06 crc kubenswrapper[4822]: E1010 07:22:06.901059 4822 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310\": container with ID starting with 89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310 not found: ID does not exist" containerID="89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.901109 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310"} err="failed to get container status \"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310\": rpc error: code = NotFound desc = could not find container \"89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310\": container with ID starting with 89e4ce49dc68dd8da00cda989ad6c57cc97a94ddc3feb86ddb5d3552bc036310 not found: ID does not exist" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.901139 4822 scope.go:117] "RemoveContainer" containerID="a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7" Oct 10 07:22:06 crc kubenswrapper[4822]: E1010 07:22:06.901501 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7\": container with ID starting with a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7 not found: ID does not exist" containerID="a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.901535 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7"} err="failed to get container status \"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7\": rpc error: code = NotFound desc = could not find container 
\"a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7\": container with ID starting with a88c2646dc804acc57988499218285a8eb30354707c261ea9df6b9cd89b084d7 not found: ID does not exist" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.901557 4822 scope.go:117] "RemoveContainer" containerID="9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd" Oct 10 07:22:06 crc kubenswrapper[4822]: E1010 07:22:06.901853 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd\": container with ID starting with 9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd not found: ID does not exist" containerID="9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd" Oct 10 07:22:06 crc kubenswrapper[4822]: I1010 07:22:06.901895 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd"} err="failed to get container status \"9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd\": rpc error: code = NotFound desc = could not find container \"9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd\": container with ID starting with 9ac5c418f0681da475ee4ee84c0d009380f9e1cb4e63a966e14381f4c25ebbfd not found: ID does not exist" Oct 10 07:22:07 crc kubenswrapper[4822]: I1010 07:22:07.182302 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:22:07 crc kubenswrapper[4822]: I1010 07:22:07.197410 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95z7x"] Oct 10 07:22:07 crc kubenswrapper[4822]: I1010 07:22:07.665927 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" 
path="/var/lib/kubelet/pods/c5ad5ef3-5771-4170-93af-8e95d561efe7/volumes" Oct 10 07:23:31 crc kubenswrapper[4822]: I1010 07:23:31.336590 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:23:31 crc kubenswrapper[4822]: I1010 07:23:31.337555 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:24:01 crc kubenswrapper[4822]: I1010 07:24:01.337334 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:24:01 crc kubenswrapper[4822]: I1010 07:24:01.338203 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:24:31 crc kubenswrapper[4822]: I1010 07:24:31.337144 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:24:31 crc kubenswrapper[4822]: I1010 07:24:31.337907 4822 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:24:31 crc kubenswrapper[4822]: I1010 07:24:31.337967 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:24:31 crc kubenswrapper[4822]: I1010 07:24:31.338674 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:24:31 crc kubenswrapper[4822]: I1010 07:24:31.338766 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" gracePeriod=600 Oct 10 07:24:31 crc kubenswrapper[4822]: E1010 07:24:31.463569 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:24:32 crc kubenswrapper[4822]: I1010 07:24:32.091193 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" exitCode=0 Oct 10 07:24:32 crc kubenswrapper[4822]: I1010 07:24:32.091276 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c"} Oct 10 07:24:32 crc kubenswrapper[4822]: I1010 07:24:32.091341 4822 scope.go:117] "RemoveContainer" containerID="2fba85c42d900de94cc2de960a8c0e1af30733ca35d7cc574ce9ab5a7370b460" Oct 10 07:24:32 crc kubenswrapper[4822]: I1010 07:24:32.092210 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:24:32 crc kubenswrapper[4822]: E1010 07:24:32.092702 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.171665 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:38 crc kubenswrapper[4822]: E1010 07:24:38.179060 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="extract-content" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.179082 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="extract-content" Oct 10 07:24:38 crc kubenswrapper[4822]: E1010 07:24:38.179106 4822 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="extract-utilities" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.179115 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="extract-utilities" Oct 10 07:24:38 crc kubenswrapper[4822]: E1010 07:24:38.179141 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="registry-server" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.179150 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="registry-server" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.179336 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ad5ef3-5771-4170-93af-8e95d561efe7" containerName="registry-server" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.180624 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.192531 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.241062 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpmz\" (UniqueName: \"kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.241112 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.241144 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.342675 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpmz\" (UniqueName: \"kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.342745 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.342785 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.343324 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.343336 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.376547 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpmz\" (UniqueName: \"kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz\") pod \"certified-operators-dc9th\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:38 crc kubenswrapper[4822]: I1010 07:24:38.509180 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:39 crc kubenswrapper[4822]: I1010 07:24:39.079827 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:39 crc kubenswrapper[4822]: I1010 07:24:39.169221 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerStarted","Data":"6f2cc04e902555d1f4b021e9be4a998d08cb77d7b13c934f9c70426ea334f672"} Oct 10 07:24:40 crc kubenswrapper[4822]: I1010 07:24:40.178430 4822 generic.go:334] "Generic (PLEG): container finished" podID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerID="ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15" exitCode=0 Oct 10 07:24:40 crc kubenswrapper[4822]: I1010 07:24:40.178530 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerDied","Data":"ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15"} Oct 10 07:24:41 crc kubenswrapper[4822]: I1010 07:24:41.192496 4822 generic.go:334] "Generic (PLEG): container finished" podID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerID="b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314" exitCode=0 Oct 10 07:24:41 crc kubenswrapper[4822]: I1010 07:24:41.192540 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerDied","Data":"b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314"} Oct 10 07:24:42 crc kubenswrapper[4822]: I1010 07:24:42.203028 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" 
event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerStarted","Data":"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700"} Oct 10 07:24:42 crc kubenswrapper[4822]: I1010 07:24:42.223795 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dc9th" podStartSLOduration=2.704734143 podStartE2EDuration="4.223763922s" podCreationTimestamp="2025-10-10 07:24:38 +0000 UTC" firstStartedPulling="2025-10-10 07:24:40.184221882 +0000 UTC m=+3627.279380078" lastFinishedPulling="2025-10-10 07:24:41.703251641 +0000 UTC m=+3628.798409857" observedRunningTime="2025-10-10 07:24:42.222296349 +0000 UTC m=+3629.317454575" watchObservedRunningTime="2025-10-10 07:24:42.223763922 +0000 UTC m=+3629.318922158" Oct 10 07:24:46 crc kubenswrapper[4822]: I1010 07:24:46.650174 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:24:46 crc kubenswrapper[4822]: E1010 07:24:46.650902 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:24:48 crc kubenswrapper[4822]: I1010 07:24:48.510072 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:48 crc kubenswrapper[4822]: I1010 07:24:48.510164 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:48 crc kubenswrapper[4822]: I1010 07:24:48.570386 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:49 crc kubenswrapper[4822]: I1010 07:24:49.343119 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:49 crc kubenswrapper[4822]: I1010 07:24:49.409238 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.277422 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dc9th" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="registry-server" containerID="cri-o://ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700" gracePeriod=2 Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.755167 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.852508 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities\") pod \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.852555 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content\") pod \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.852591 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpmz\" (UniqueName: \"kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz\") pod 
\"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\" (UID: \"cdec9403-2ab9-483e-b62f-e3e4f1bef95d\") " Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.854865 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities" (OuterVolumeSpecName: "utilities") pod "cdec9403-2ab9-483e-b62f-e3e4f1bef95d" (UID: "cdec9403-2ab9-483e-b62f-e3e4f1bef95d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.859646 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz" (OuterVolumeSpecName: "kube-api-access-6tpmz") pod "cdec9403-2ab9-483e-b62f-e3e4f1bef95d" (UID: "cdec9403-2ab9-483e-b62f-e3e4f1bef95d"). InnerVolumeSpecName "kube-api-access-6tpmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.909496 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdec9403-2ab9-483e-b62f-e3e4f1bef95d" (UID: "cdec9403-2ab9-483e-b62f-e3e4f1bef95d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.954416 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.954468 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:24:51 crc kubenswrapper[4822]: I1010 07:24:51.954481 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tpmz\" (UniqueName: \"kubernetes.io/projected/cdec9403-2ab9-483e-b62f-e3e4f1bef95d-kube-api-access-6tpmz\") on node \"crc\" DevicePath \"\"" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.300245 4822 generic.go:334] "Generic (PLEG): container finished" podID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerID="ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700" exitCode=0 Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.300323 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerDied","Data":"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700"} Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.300388 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc9th" event={"ID":"cdec9403-2ab9-483e-b62f-e3e4f1bef95d","Type":"ContainerDied","Data":"6f2cc04e902555d1f4b021e9be4a998d08cb77d7b13c934f9c70426ea334f672"} Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.300424 4822 scope.go:117] "RemoveContainer" containerID="ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 
07:24:52.300930 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc9th" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.343888 4822 scope.go:117] "RemoveContainer" containerID="b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.366067 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.379162 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dc9th"] Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.389315 4822 scope.go:117] "RemoveContainer" containerID="ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.411945 4822 scope.go:117] "RemoveContainer" containerID="ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700" Oct 10 07:24:52 crc kubenswrapper[4822]: E1010 07:24:52.412454 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700\": container with ID starting with ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700 not found: ID does not exist" containerID="ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.412500 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700"} err="failed to get container status \"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700\": rpc error: code = NotFound desc = could not find container \"ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700\": container with ID starting with 
ca58ee9b8bc27660b655ed1340479c54bca87c8d8ff3b1fbac38da055a1dc700 not found: ID does not exist" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.412530 4822 scope.go:117] "RemoveContainer" containerID="b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314" Oct 10 07:24:52 crc kubenswrapper[4822]: E1010 07:24:52.413100 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314\": container with ID starting with b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314 not found: ID does not exist" containerID="b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.413260 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314"} err="failed to get container status \"b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314\": rpc error: code = NotFound desc = could not find container \"b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314\": container with ID starting with b4cffb8d62e0ed5c7ae57b17296b628ab61a0a47d67de2abdf6c5882e162e314 not found: ID does not exist" Oct 10 07:24:52 crc kubenswrapper[4822]: I1010 07:24:52.413391 4822 scope.go:117] "RemoveContainer" containerID="ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15" Oct 10 07:24:52 crc kubenswrapper[4822]: E1010 07:24:52.413959 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15\": container with ID starting with ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15 not found: ID does not exist" containerID="ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15" Oct 10 07:24:52 crc 
kubenswrapper[4822]: I1010 07:24:52.414031 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15"} err="failed to get container status \"ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15\": rpc error: code = NotFound desc = could not find container \"ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15\": container with ID starting with ff251ee481c9527f8230bee918f8d6c145c6492211cbdc402c756f7f249f6a15 not found: ID does not exist" Oct 10 07:24:53 crc kubenswrapper[4822]: I1010 07:24:53.669054 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" path="/var/lib/kubelet/pods/cdec9403-2ab9-483e-b62f-e3e4f1bef95d/volumes" Oct 10 07:25:01 crc kubenswrapper[4822]: I1010 07:25:01.651636 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:25:01 crc kubenswrapper[4822]: E1010 07:25:01.653055 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:25:13 crc kubenswrapper[4822]: I1010 07:25:13.656308 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:25:13 crc kubenswrapper[4822]: E1010 07:25:13.657237 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:25:28 crc kubenswrapper[4822]: I1010 07:25:28.650952 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:25:28 crc kubenswrapper[4822]: E1010 07:25:28.652155 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:25:43 crc kubenswrapper[4822]: I1010 07:25:43.661201 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:25:43 crc kubenswrapper[4822]: E1010 07:25:43.661936 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:25:56 crc kubenswrapper[4822]: I1010 07:25:56.650235 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:25:56 crc kubenswrapper[4822]: E1010 07:25:56.650949 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:26:09 crc kubenswrapper[4822]: I1010 07:26:09.708680 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:26:09 crc kubenswrapper[4822]: E1010 07:26:09.709371 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:26:23 crc kubenswrapper[4822]: I1010 07:26:23.650588 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:26:23 crc kubenswrapper[4822]: E1010 07:26:23.651286 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:26:36 crc kubenswrapper[4822]: I1010 07:26:36.650103 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:26:36 crc kubenswrapper[4822]: E1010 07:26:36.650701 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:26:50 crc kubenswrapper[4822]: I1010 07:26:50.650174 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:26:50 crc kubenswrapper[4822]: E1010 07:26:50.650983 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:27:04 crc kubenswrapper[4822]: I1010 07:27:04.651246 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:27:04 crc kubenswrapper[4822]: E1010 07:27:04.652405 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:27:19 crc kubenswrapper[4822]: I1010 07:27:19.650976 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:27:19 crc kubenswrapper[4822]: E1010 07:27:19.652287 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:27:32 crc kubenswrapper[4822]: I1010 07:27:32.650306 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:27:32 crc kubenswrapper[4822]: E1010 07:27:32.651438 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:27:46 crc kubenswrapper[4822]: I1010 07:27:46.650969 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:27:46 crc kubenswrapper[4822]: E1010 07:27:46.651744 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:28:00 crc kubenswrapper[4822]: I1010 07:28:00.650736 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:28:00 crc kubenswrapper[4822]: E1010 07:28:00.651496 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:28:15 crc kubenswrapper[4822]: I1010 07:28:15.651119 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:28:15 crc kubenswrapper[4822]: E1010 07:28:15.652377 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:28:30 crc kubenswrapper[4822]: I1010 07:28:30.650767 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:28:30 crc kubenswrapper[4822]: E1010 07:28:30.651773 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:28:44 crc kubenswrapper[4822]: I1010 07:28:44.650141 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:28:44 crc kubenswrapper[4822]: E1010 07:28:44.651038 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:28:57 crc kubenswrapper[4822]: I1010 07:28:57.650638 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:28:57 crc kubenswrapper[4822]: E1010 07:28:57.651431 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:29:11 crc kubenswrapper[4822]: I1010 07:29:11.650731 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:29:11 crc kubenswrapper[4822]: E1010 07:29:11.651765 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:29:25 crc kubenswrapper[4822]: I1010 07:29:25.650525 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:29:25 crc kubenswrapper[4822]: E1010 07:29:25.651440 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:29:40 crc kubenswrapper[4822]: I1010 07:29:40.650981 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:29:41 crc kubenswrapper[4822]: I1010 07:29:41.787760 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356"} Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.152256 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls"] Oct 10 07:30:00 crc kubenswrapper[4822]: E1010 07:30:00.153963 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.154057 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4822]: E1010 07:30:00.154147 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.154163 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4822]: E1010 07:30:00.154224 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.154239 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.154692 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdec9403-2ab9-483e-b62f-e3e4f1bef95d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.157153 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.159115 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.159664 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.164822 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls"] Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.279883 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.279923 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume\") pod 
\"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.279980 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6m77\" (UniqueName: \"kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.381591 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.381644 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.381713 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6m77\" (UniqueName: \"kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.382673 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.387861 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.399008 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6m77\" (UniqueName: \"kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77\") pod \"collect-profiles-29334690-vfjls\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.478634 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.865231 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls"] Oct 10 07:30:00 crc kubenswrapper[4822]: I1010 07:30:00.948625 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" event={"ID":"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd","Type":"ContainerStarted","Data":"5c5727b31097a23965b9c5710db00d935bd7921782bd9ffd735fa3099ad898a8"} Oct 10 07:30:01 crc kubenswrapper[4822]: I1010 07:30:01.962854 4822 generic.go:334] "Generic (PLEG): container finished" podID="c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" containerID="ae0a478de330f43220f4507c8bc6d37313574d0121d7c7723cb350513f215dba" exitCode=0 Oct 10 07:30:01 crc kubenswrapper[4822]: I1010 07:30:01.962956 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" event={"ID":"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd","Type":"ContainerDied","Data":"ae0a478de330f43220f4507c8bc6d37313574d0121d7c7723cb350513f215dba"} Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.269792 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.327288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume\") pod \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.327374 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6m77\" (UniqueName: \"kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77\") pod \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.327421 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume\") pod \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\" (UID: \"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd\") " Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.328527 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" (UID: "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.332768 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77" (OuterVolumeSpecName: "kube-api-access-f6m77") pod "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" (UID: "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd"). 
InnerVolumeSpecName "kube-api-access-f6m77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.332882 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" (UID: "c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.429432 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.429486 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6m77\" (UniqueName: \"kubernetes.io/projected/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-kube-api-access-f6m77\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.429504 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.995102 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" event={"ID":"c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd","Type":"ContainerDied","Data":"5c5727b31097a23965b9c5710db00d935bd7921782bd9ffd735fa3099ad898a8"} Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.995867 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c5727b31097a23965b9c5710db00d935bd7921782bd9ffd735fa3099ad898a8" Oct 10 07:30:03 crc kubenswrapper[4822]: I1010 07:30:03.995392 4822 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls" Oct 10 07:30:04 crc kubenswrapper[4822]: I1010 07:30:04.334007 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5"] Oct 10 07:30:04 crc kubenswrapper[4822]: I1010 07:30:04.339228 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-8pgt5"] Oct 10 07:30:05 crc kubenswrapper[4822]: I1010 07:30:05.659528 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7af8772-83f7-4add-b635-3ceda2a642d7" path="/var/lib/kubelet/pods/c7af8772-83f7-4add-b635-3ceda2a642d7/volumes" Oct 10 07:30:22 crc kubenswrapper[4822]: I1010 07:30:22.056102 4822 scope.go:117] "RemoveContainer" containerID="2b0a4c86aa4a5d5f5e49b7088c533c1387da56f53e88bd7617c63d83ab2dcb9f" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.522457 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6wz4"] Oct 10 07:30:45 crc kubenswrapper[4822]: E1010 07:30:45.523587 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" containerName="collect-profiles" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.523603 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" containerName="collect-profiles" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.523826 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" containerName="collect-profiles" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.525042 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.559033 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6wz4"] Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.666452 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjmv\" (UniqueName: \"kubernetes.io/projected/5bca889e-d4b2-425c-9238-bbd38169d397-kube-api-access-cfjmv\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.666529 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-utilities\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.666578 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-catalog-content\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.768154 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-utilities\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.768241 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-catalog-content\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.768287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjmv\" (UniqueName: \"kubernetes.io/projected/5bca889e-d4b2-425c-9238-bbd38169d397-kube-api-access-cfjmv\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.769113 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-utilities\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.769384 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bca889e-d4b2-425c-9238-bbd38169d397-catalog-content\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.790679 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjmv\" (UniqueName: \"kubernetes.io/projected/5bca889e-d4b2-425c-9238-bbd38169d397-kube-api-access-cfjmv\") pod \"community-operators-m6wz4\" (UID: \"5bca889e-d4b2-425c-9238-bbd38169d397\") " pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:45 crc kubenswrapper[4822]: I1010 07:30:45.861065 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:46 crc kubenswrapper[4822]: I1010 07:30:46.403389 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6wz4"] Oct 10 07:30:46 crc kubenswrapper[4822]: I1010 07:30:46.447241 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6wz4" event={"ID":"5bca889e-d4b2-425c-9238-bbd38169d397","Type":"ContainerStarted","Data":"fd06ec14c1aaf152aba22135bab3ba2bddda9271c98f1155167ec7c90f4a4986"} Oct 10 07:30:47 crc kubenswrapper[4822]: I1010 07:30:47.459957 4822 generic.go:334] "Generic (PLEG): container finished" podID="5bca889e-d4b2-425c-9238-bbd38169d397" containerID="cad2ffd6591f2b06176456b2ed0511ddcc76df6b9f7df7853805961adfb8631d" exitCode=0 Oct 10 07:30:47 crc kubenswrapper[4822]: I1010 07:30:47.460065 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6wz4" event={"ID":"5bca889e-d4b2-425c-9238-bbd38169d397","Type":"ContainerDied","Data":"cad2ffd6591f2b06176456b2ed0511ddcc76df6b9f7df7853805961adfb8631d"} Oct 10 07:30:47 crc kubenswrapper[4822]: I1010 07:30:47.463158 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:30:51 crc kubenswrapper[4822]: I1010 07:30:51.501136 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6wz4" event={"ID":"5bca889e-d4b2-425c-9238-bbd38169d397","Type":"ContainerStarted","Data":"f01cde67d6d2c6865830e6e8838e1b3782ce49521f6a6086cd54067bcba737f2"} Oct 10 07:30:51 crc kubenswrapper[4822]: E1010 07:30:51.864158 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bca889e_d4b2_425c_9238_bbd38169d397.slice/crio-conmon-f01cde67d6d2c6865830e6e8838e1b3782ce49521f6a6086cd54067bcba737f2.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:30:52 crc kubenswrapper[4822]: I1010 07:30:52.514798 4822 generic.go:334] "Generic (PLEG): container finished" podID="5bca889e-d4b2-425c-9238-bbd38169d397" containerID="f01cde67d6d2c6865830e6e8838e1b3782ce49521f6a6086cd54067bcba737f2" exitCode=0 Oct 10 07:30:52 crc kubenswrapper[4822]: I1010 07:30:52.514919 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6wz4" event={"ID":"5bca889e-d4b2-425c-9238-bbd38169d397","Type":"ContainerDied","Data":"f01cde67d6d2c6865830e6e8838e1b3782ce49521f6a6086cd54067bcba737f2"} Oct 10 07:30:53 crc kubenswrapper[4822]: I1010 07:30:53.523558 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6wz4" event={"ID":"5bca889e-d4b2-425c-9238-bbd38169d397","Type":"ContainerStarted","Data":"a0c84c04c39fc3b4f693ca4ccc4739b67fe6efd157f74b563449698b2f8a62fa"} Oct 10 07:30:53 crc kubenswrapper[4822]: I1010 07:30:53.548194 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6wz4" podStartSLOduration=3.073764546 podStartE2EDuration="8.548166907s" podCreationTimestamp="2025-10-10 07:30:45 +0000 UTC" firstStartedPulling="2025-10-10 07:30:47.46268524 +0000 UTC m=+3994.557843476" lastFinishedPulling="2025-10-10 07:30:52.937087621 +0000 UTC m=+4000.032245837" observedRunningTime="2025-10-10 07:30:53.542771483 +0000 UTC m=+4000.637929719" watchObservedRunningTime="2025-10-10 07:30:53.548166907 +0000 UTC m=+4000.643325103" Oct 10 07:30:55 crc kubenswrapper[4822]: I1010 07:30:55.861557 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:55 crc 
kubenswrapper[4822]: I1010 07:30:55.861599 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:30:55 crc kubenswrapper[4822]: I1010 07:30:55.903086 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:31:05 crc kubenswrapper[4822]: I1010 07:31:05.904730 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6wz4" Oct 10 07:31:05 crc kubenswrapper[4822]: I1010 07:31:05.980107 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6wz4"] Oct 10 07:31:06 crc kubenswrapper[4822]: I1010 07:31:06.015826 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 07:31:06 crc kubenswrapper[4822]: I1010 07:31:06.620196 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-724mg" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="registry-server" containerID="cri-o://aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599" gracePeriod=2 Oct 10 07:31:06 crc kubenswrapper[4822]: I1010 07:31:06.976264 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-724mg" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.083071 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities\") pod \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.083160 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content\") pod \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.083189 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnj4p\" (UniqueName: \"kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p\") pod \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\" (UID: \"c48094cd-a9ab-4c00-9e04-cc5bcaa99716\") " Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.083559 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities" (OuterVolumeSpecName: "utilities") pod "c48094cd-a9ab-4c00-9e04-cc5bcaa99716" (UID: "c48094cd-a9ab-4c00-9e04-cc5bcaa99716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.092028 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p" (OuterVolumeSpecName: "kube-api-access-tnj4p") pod "c48094cd-a9ab-4c00-9e04-cc5bcaa99716" (UID: "c48094cd-a9ab-4c00-9e04-cc5bcaa99716"). InnerVolumeSpecName "kube-api-access-tnj4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.125709 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c48094cd-a9ab-4c00-9e04-cc5bcaa99716" (UID: "c48094cd-a9ab-4c00-9e04-cc5bcaa99716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.184285 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.184327 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.184340 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnj4p\" (UniqueName: \"kubernetes.io/projected/c48094cd-a9ab-4c00-9e04-cc5bcaa99716-kube-api-access-tnj4p\") on node \"crc\" DevicePath \"\"" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.629323 4822 generic.go:334] "Generic (PLEG): container finished" podID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerID="aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599" exitCode=0 Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.629373 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerDied","Data":"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599"} Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.629397 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-724mg" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.629474 4822 scope.go:117] "RemoveContainer" containerID="aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.629458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-724mg" event={"ID":"c48094cd-a9ab-4c00-9e04-cc5bcaa99716","Type":"ContainerDied","Data":"fbd536e1b072ebb7ccb1029f09f2662ed42b0cb1c284ed6a4244ae3970db0ce5"} Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.651667 4822 scope.go:117] "RemoveContainer" containerID="71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.675276 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.675334 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-724mg"] Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.680770 4822 scope.go:117] "RemoveContainer" containerID="0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.706580 4822 scope.go:117] "RemoveContainer" containerID="aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599" Oct 10 07:31:07 crc kubenswrapper[4822]: E1010 07:31:07.707373 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599\": container with ID starting with aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599 not found: ID does not exist" containerID="aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.707416 
4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599"} err="failed to get container status \"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599\": rpc error: code = NotFound desc = could not find container \"aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599\": container with ID starting with aa13673e9550c03156fea22899c43326b020c6eaf95c377222dc9f99c05fd599 not found: ID does not exist" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.707442 4822 scope.go:117] "RemoveContainer" containerID="71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33" Oct 10 07:31:07 crc kubenswrapper[4822]: E1010 07:31:07.707839 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33\": container with ID starting with 71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33 not found: ID does not exist" containerID="71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.707888 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33"} err="failed to get container status \"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33\": rpc error: code = NotFound desc = could not find container \"71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33\": container with ID starting with 71488e06386f0ac7461b198773d3a279a314c5a4712d6deb44628dab602b3f33 not found: ID does not exist" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.707922 4822 scope.go:117] "RemoveContainer" containerID="0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304" Oct 10 07:31:07 crc kubenswrapper[4822]: E1010 
07:31:07.708242 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304\": container with ID starting with 0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304 not found: ID does not exist" containerID="0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304" Oct 10 07:31:07 crc kubenswrapper[4822]: I1010 07:31:07.708275 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304"} err="failed to get container status \"0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304\": rpc error: code = NotFound desc = could not find container \"0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304\": container with ID starting with 0a2f4be1a4469c1e14db435003d0c1ab1bf78f56719bb07f84f6aec17a608304 not found: ID does not exist" Oct 10 07:31:09 crc kubenswrapper[4822]: I1010 07:31:09.661378 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" path="/var/lib/kubelet/pods/c48094cd-a9ab-4c00-9e04-cc5bcaa99716/volumes" Oct 10 07:32:01 crc kubenswrapper[4822]: I1010 07:32:01.337179 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:32:01 crc kubenswrapper[4822]: I1010 07:32:01.337690 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 10 07:32:31 crc kubenswrapper[4822]: I1010 07:32:31.336718 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:32:31 crc kubenswrapper[4822]: I1010 07:32:31.337548 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.337406 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.338206 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.338285 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.339375 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356"} 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.339491 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356" gracePeriod=600 Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.639324 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356" exitCode=0 Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.639402 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356"} Oct 10 07:33:01 crc kubenswrapper[4822]: I1010 07:33:01.639577 4822 scope.go:117] "RemoveContainer" containerID="39db340a18691bb6183f2319e3527e1dead2358636453782fb858c0354536c4c" Oct 10 07:33:02 crc kubenswrapper[4822]: I1010 07:33:02.659154 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc"} Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.912962 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:08 crc kubenswrapper[4822]: E1010 07:33:08.914021 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="extract-content" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.914045 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="extract-content" Oct 10 07:33:08 crc kubenswrapper[4822]: E1010 07:33:08.914081 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="registry-server" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.914095 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="registry-server" Oct 10 07:33:08 crc kubenswrapper[4822]: E1010 07:33:08.914120 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="extract-utilities" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.914133 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="extract-utilities" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.914363 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48094cd-a9ab-4c00-9e04-cc5bcaa99716" containerName="registry-server" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.915830 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.932274 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.944445 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.944657 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:08 crc kubenswrapper[4822]: I1010 07:33:08.944748 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84x9\" (UniqueName: \"kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.045367 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.045435 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-l84x9\" (UniqueName: \"kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.045522 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.045996 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.046223 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.072008 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84x9\" (UniqueName: \"kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9\") pod \"redhat-operators-5fdzd\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.240247 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:09 crc kubenswrapper[4822]: I1010 07:33:09.714759 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:10 crc kubenswrapper[4822]: I1010 07:33:10.729583 4822 generic.go:334] "Generic (PLEG): container finished" podID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerID="fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e" exitCode=0 Oct 10 07:33:10 crc kubenswrapper[4822]: I1010 07:33:10.729633 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerDied","Data":"fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e"} Oct 10 07:33:10 crc kubenswrapper[4822]: I1010 07:33:10.730010 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerStarted","Data":"157f3d606d87bc6394069a985486de818f95c429ee94c0d424eb453d1a76a425"} Oct 10 07:33:12 crc kubenswrapper[4822]: I1010 07:33:12.755304 4822 generic.go:334] "Generic (PLEG): container finished" podID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerID="a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5" exitCode=0 Oct 10 07:33:12 crc kubenswrapper[4822]: I1010 07:33:12.755421 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerDied","Data":"a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5"} Oct 10 07:33:13 crc kubenswrapper[4822]: I1010 07:33:13.768897 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" 
event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerStarted","Data":"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d"} Oct 10 07:33:13 crc kubenswrapper[4822]: I1010 07:33:13.788900 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fdzd" podStartSLOduration=3.395861917 podStartE2EDuration="5.788870637s" podCreationTimestamp="2025-10-10 07:33:08 +0000 UTC" firstStartedPulling="2025-10-10 07:33:10.731545403 +0000 UTC m=+4137.826703629" lastFinishedPulling="2025-10-10 07:33:13.124554153 +0000 UTC m=+4140.219712349" observedRunningTime="2025-10-10 07:33:13.784890063 +0000 UTC m=+4140.880048309" watchObservedRunningTime="2025-10-10 07:33:13.788870637 +0000 UTC m=+4140.884028843" Oct 10 07:33:19 crc kubenswrapper[4822]: I1010 07:33:19.241274 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:19 crc kubenswrapper[4822]: I1010 07:33:19.242559 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:19 crc kubenswrapper[4822]: I1010 07:33:19.302688 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:19 crc kubenswrapper[4822]: I1010 07:33:19.887222 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:19 crc kubenswrapper[4822]: I1010 07:33:19.938846 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:21 crc kubenswrapper[4822]: I1010 07:33:21.841000 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fdzd" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="registry-server" 
containerID="cri-o://294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d" gracePeriod=2 Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.225338 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.347911 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content\") pod \"8207c27b-a38f-47d6-99c7-79c5304dbee9\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.348093 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l84x9\" (UniqueName: \"kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9\") pod \"8207c27b-a38f-47d6-99c7-79c5304dbee9\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.348147 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities\") pod \"8207c27b-a38f-47d6-99c7-79c5304dbee9\" (UID: \"8207c27b-a38f-47d6-99c7-79c5304dbee9\") " Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.349844 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities" (OuterVolumeSpecName: "utilities") pod "8207c27b-a38f-47d6-99c7-79c5304dbee9" (UID: "8207c27b-a38f-47d6-99c7-79c5304dbee9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.354710 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9" (OuterVolumeSpecName: "kube-api-access-l84x9") pod "8207c27b-a38f-47d6-99c7-79c5304dbee9" (UID: "8207c27b-a38f-47d6-99c7-79c5304dbee9"). InnerVolumeSpecName "kube-api-access-l84x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.450858 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l84x9\" (UniqueName: \"kubernetes.io/projected/8207c27b-a38f-47d6-99c7-79c5304dbee9-kube-api-access-l84x9\") on node \"crc\" DevicePath \"\"" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.450895 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.515027 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8207c27b-a38f-47d6-99c7-79c5304dbee9" (UID: "8207c27b-a38f-47d6-99c7-79c5304dbee9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.552523 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8207c27b-a38f-47d6-99c7-79c5304dbee9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.856555 4822 generic.go:334] "Generic (PLEG): container finished" podID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerID="294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d" exitCode=0 Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.856620 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerDied","Data":"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d"} Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.856660 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdzd" event={"ID":"8207c27b-a38f-47d6-99c7-79c5304dbee9","Type":"ContainerDied","Data":"157f3d606d87bc6394069a985486de818f95c429ee94c0d424eb453d1a76a425"} Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.856692 4822 scope.go:117] "RemoveContainer" containerID="294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.856918 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdzd" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.893067 4822 scope.go:117] "RemoveContainer" containerID="a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.913351 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.922735 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fdzd"] Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.945500 4822 scope.go:117] "RemoveContainer" containerID="fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.970250 4822 scope.go:117] "RemoveContainer" containerID="294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d" Oct 10 07:33:22 crc kubenswrapper[4822]: E1010 07:33:22.971041 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d\": container with ID starting with 294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d not found: ID does not exist" containerID="294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.971110 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d"} err="failed to get container status \"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d\": rpc error: code = NotFound desc = could not find container \"294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d\": container with ID starting with 294ee598becf9f86fcb59f83e2334b21350295aac33b5e724360619b2eef953d not found: ID does 
not exist" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.971158 4822 scope.go:117] "RemoveContainer" containerID="a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5" Oct 10 07:33:22 crc kubenswrapper[4822]: E1010 07:33:22.971734 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5\": container with ID starting with a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5 not found: ID does not exist" containerID="a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.971774 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5"} err="failed to get container status \"a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5\": rpc error: code = NotFound desc = could not find container \"a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5\": container with ID starting with a27dbc468c4b0496e96e95effb6f1a927cb53abc520587a7161cf1e8468781e5 not found: ID does not exist" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.971831 4822 scope.go:117] "RemoveContainer" containerID="fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e" Oct 10 07:33:22 crc kubenswrapper[4822]: E1010 07:33:22.972361 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e\": container with ID starting with fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e not found: ID does not exist" containerID="fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e" Oct 10 07:33:22 crc kubenswrapper[4822]: I1010 07:33:22.972406 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e"} err="failed to get container status \"fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e\": rpc error: code = NotFound desc = could not find container \"fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e\": container with ID starting with fa17b93aca0abb7568374d5247621ffc682b4150ab34bdfc5374ef553ecb7f1e not found: ID does not exist" Oct 10 07:33:23 crc kubenswrapper[4822]: I1010 07:33:23.660161 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" path="/var/lib/kubelet/pods/8207c27b-a38f-47d6-99c7-79c5304dbee9/volumes" Oct 10 07:35:01 crc kubenswrapper[4822]: I1010 07:35:01.336894 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:35:01 crc kubenswrapper[4822]: I1010 07:35:01.337738 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.455407 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:23 crc kubenswrapper[4822]: E1010 07:35:23.457569 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="registry-server" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.457586 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="registry-server" Oct 10 07:35:23 crc kubenswrapper[4822]: E1010 07:35:23.457611 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="extract-utilities" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.457620 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="extract-utilities" Oct 10 07:35:23 crc kubenswrapper[4822]: E1010 07:35:23.457635 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="extract-content" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.457645 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="extract-content" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.457856 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8207c27b-a38f-47d6-99c7-79c5304dbee9" containerName="registry-server" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.459181 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.476016 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.648157 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48476\" (UniqueName: \"kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.648259 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.648302 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.749707 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.749789 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-48476\" (UniqueName: \"kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.749858 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.750187 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.750242 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:23 crc kubenswrapper[4822]: I1010 07:35:23.795703 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48476\" (UniqueName: \"kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476\") pod \"certified-operators-dgx56\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:24 crc kubenswrapper[4822]: I1010 07:35:24.079364 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:24 crc kubenswrapper[4822]: I1010 07:35:24.587949 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:24 crc kubenswrapper[4822]: I1010 07:35:24.884436 4822 generic.go:334] "Generic (PLEG): container finished" podID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerID="6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c" exitCode=0 Oct 10 07:35:24 crc kubenswrapper[4822]: I1010 07:35:24.884561 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerDied","Data":"6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c"} Oct 10 07:35:24 crc kubenswrapper[4822]: I1010 07:35:24.884720 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerStarted","Data":"a7d3020165a364a7a9bb2a562201fb7f0d2946942b155ba3879586c44e3fac5c"} Oct 10 07:35:25 crc kubenswrapper[4822]: I1010 07:35:25.895367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerStarted","Data":"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007"} Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.265873 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.268959 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.271275 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.398923 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.399074 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m7n\" (UniqueName: \"kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.399124 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.500486 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.500610 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.500684 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m7n\" (UniqueName: \"kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.501455 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.501568 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.530247 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m7n\" (UniqueName: \"kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n\") pod \"redhat-marketplace-x4wpd\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.586572 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.905055 4822 generic.go:334] "Generic (PLEG): container finished" podID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerID="14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007" exitCode=0 Oct 10 07:35:26 crc kubenswrapper[4822]: I1010 07:35:26.905102 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerDied","Data":"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007"} Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.038204 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.912513 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a016559-c659-48f0-b72f-63a2d344018a" containerID="cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8" exitCode=0 Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.912646 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerDied","Data":"cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8"} Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.913154 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerStarted","Data":"73f2eede2fc160595292da5598b84476539442761da74a969841ccd95071e116"} Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.917581 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" 
event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerStarted","Data":"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de"} Oct 10 07:35:27 crc kubenswrapper[4822]: I1010 07:35:27.949892 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dgx56" podStartSLOduration=2.418710093 podStartE2EDuration="4.949873742s" podCreationTimestamp="2025-10-10 07:35:23 +0000 UTC" firstStartedPulling="2025-10-10 07:35:24.88686471 +0000 UTC m=+4271.982022906" lastFinishedPulling="2025-10-10 07:35:27.418028329 +0000 UTC m=+4274.513186555" observedRunningTime="2025-10-10 07:35:27.942326485 +0000 UTC m=+4275.037484681" watchObservedRunningTime="2025-10-10 07:35:27.949873742 +0000 UTC m=+4275.045031938" Oct 10 07:35:29 crc kubenswrapper[4822]: I1010 07:35:29.934416 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a016559-c659-48f0-b72f-63a2d344018a" containerID="c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0" exitCode=0 Oct 10 07:35:29 crc kubenswrapper[4822]: I1010 07:35:29.934510 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerDied","Data":"c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0"} Oct 10 07:35:30 crc kubenswrapper[4822]: I1010 07:35:30.943862 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerStarted","Data":"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955"} Oct 10 07:35:30 crc kubenswrapper[4822]: I1010 07:35:30.965913 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4wpd" podStartSLOduration=2.563181722 podStartE2EDuration="4.965894673s" podCreationTimestamp="2025-10-10 07:35:26 +0000 UTC" 
firstStartedPulling="2025-10-10 07:35:27.913977581 +0000 UTC m=+4275.009135797" lastFinishedPulling="2025-10-10 07:35:30.316690552 +0000 UTC m=+4277.411848748" observedRunningTime="2025-10-10 07:35:30.962221338 +0000 UTC m=+4278.057379544" watchObservedRunningTime="2025-10-10 07:35:30.965894673 +0000 UTC m=+4278.061052869" Oct 10 07:35:31 crc kubenswrapper[4822]: I1010 07:35:31.337226 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:35:31 crc kubenswrapper[4822]: I1010 07:35:31.337308 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:35:34 crc kubenswrapper[4822]: I1010 07:35:34.079733 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:34 crc kubenswrapper[4822]: I1010 07:35:34.080257 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:34 crc kubenswrapper[4822]: I1010 07:35:34.161352 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:35 crc kubenswrapper[4822]: I1010 07:35:35.079937 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:35 crc kubenswrapper[4822]: I1010 07:35:35.647405 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:36 crc kubenswrapper[4822]: I1010 07:35:36.587293 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:36 crc kubenswrapper[4822]: I1010 07:35:36.587639 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:36 crc kubenswrapper[4822]: I1010 07:35:36.655379 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:36 crc kubenswrapper[4822]: I1010 07:35:36.988436 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dgx56" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="registry-server" containerID="cri-o://8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de" gracePeriod=2 Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.032527 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.419111 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.572390 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities\") pod \"8c2c9462-8354-450f-9ad2-43ea510d66a2\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.572557 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content\") pod \"8c2c9462-8354-450f-9ad2-43ea510d66a2\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.572610 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48476\" (UniqueName: \"kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476\") pod \"8c2c9462-8354-450f-9ad2-43ea510d66a2\" (UID: \"8c2c9462-8354-450f-9ad2-43ea510d66a2\") " Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.573536 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities" (OuterVolumeSpecName: "utilities") pod "8c2c9462-8354-450f-9ad2-43ea510d66a2" (UID: "8c2c9462-8354-450f-9ad2-43ea510d66a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.579478 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476" (OuterVolumeSpecName: "kube-api-access-48476") pod "8c2c9462-8354-450f-9ad2-43ea510d66a2" (UID: "8c2c9462-8354-450f-9ad2-43ea510d66a2"). InnerVolumeSpecName "kube-api-access-48476". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.624753 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c2c9462-8354-450f-9ad2-43ea510d66a2" (UID: "8c2c9462-8354-450f-9ad2-43ea510d66a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.673940 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48476\" (UniqueName: \"kubernetes.io/projected/8c2c9462-8354-450f-9ad2-43ea510d66a2-kube-api-access-48476\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.673990 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.674012 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c9462-8354-450f-9ad2-43ea510d66a2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.997789 4822 generic.go:334] "Generic (PLEG): container finished" podID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerID="8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de" exitCode=0 Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.997880 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgx56" Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.997875 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerDied","Data":"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de"} Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.998386 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgx56" event={"ID":"8c2c9462-8354-450f-9ad2-43ea510d66a2","Type":"ContainerDied","Data":"a7d3020165a364a7a9bb2a562201fb7f0d2946942b155ba3879586c44e3fac5c"} Oct 10 07:35:37 crc kubenswrapper[4822]: I1010 07:35:37.998446 4822 scope.go:117] "RemoveContainer" containerID="8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.023393 4822 scope.go:117] "RemoveContainer" containerID="14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.025706 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.034449 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dgx56"] Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.053783 4822 scope.go:117] "RemoveContainer" containerID="6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.073730 4822 scope.go:117] "RemoveContainer" containerID="8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de" Oct 10 07:35:38 crc kubenswrapper[4822]: E1010 07:35:38.074132 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de\": container with ID starting with 8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de not found: ID does not exist" containerID="8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.074198 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de"} err="failed to get container status \"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de\": rpc error: code = NotFound desc = could not find container \"8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de\": container with ID starting with 8ab7ad04725da2548d34b52358aa9d06a8ac1f8dbd1d4944b0b17918ae2896de not found: ID does not exist" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.074229 4822 scope.go:117] "RemoveContainer" containerID="14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007" Oct 10 07:35:38 crc kubenswrapper[4822]: E1010 07:35:38.074572 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007\": container with ID starting with 14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007 not found: ID does not exist" containerID="14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.074606 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007"} err="failed to get container status \"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007\": rpc error: code = NotFound desc = could not find container \"14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007\": container with ID 
starting with 14a60952d0d2a9822732aaf80cf75b4a1cd4818984e1d66e83a02e2634cba007 not found: ID does not exist" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.074627 4822 scope.go:117] "RemoveContainer" containerID="6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c" Oct 10 07:35:38 crc kubenswrapper[4822]: E1010 07:35:38.074830 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c\": container with ID starting with 6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c not found: ID does not exist" containerID="6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.074852 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c"} err="failed to get container status \"6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c\": rpc error: code = NotFound desc = could not find container \"6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c\": container with ID starting with 6630acd62ca1f38196bd49b162b29ab7894dc8554b720ebfecca27a7be39861c not found: ID does not exist" Oct 10 07:35:38 crc kubenswrapper[4822]: I1010 07:35:38.455103 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.007596 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4wpd" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="registry-server" containerID="cri-o://02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955" gracePeriod=2 Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.376612 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.505704 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content\") pod \"8a016559-c659-48f0-b72f-63a2d344018a\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.505764 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities\") pod \"8a016559-c659-48f0-b72f-63a2d344018a\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.505792 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8m7n\" (UniqueName: \"kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n\") pod \"8a016559-c659-48f0-b72f-63a2d344018a\" (UID: \"8a016559-c659-48f0-b72f-63a2d344018a\") " Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.506678 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities" (OuterVolumeSpecName: "utilities") pod "8a016559-c659-48f0-b72f-63a2d344018a" (UID: "8a016559-c659-48f0-b72f-63a2d344018a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.515494 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n" (OuterVolumeSpecName: "kube-api-access-t8m7n") pod "8a016559-c659-48f0-b72f-63a2d344018a" (UID: "8a016559-c659-48f0-b72f-63a2d344018a"). InnerVolumeSpecName "kube-api-access-t8m7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.522263 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a016559-c659-48f0-b72f-63a2d344018a" (UID: "8a016559-c659-48f0-b72f-63a2d344018a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.607957 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.607986 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a016559-c659-48f0-b72f-63a2d344018a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.607996 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8m7n\" (UniqueName: \"kubernetes.io/projected/8a016559-c659-48f0-b72f-63a2d344018a-kube-api-access-t8m7n\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:39 crc kubenswrapper[4822]: I1010 07:35:39.659108 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" path="/var/lib/kubelet/pods/8c2c9462-8354-450f-9ad2-43ea510d66a2/volumes" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.021409 4822 generic.go:334] "Generic (PLEG): container finished" podID="8a016559-c659-48f0-b72f-63a2d344018a" containerID="02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955" exitCode=0 Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.021482 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" 
event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerDied","Data":"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955"} Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.021523 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4wpd" event={"ID":"8a016559-c659-48f0-b72f-63a2d344018a","Type":"ContainerDied","Data":"73f2eede2fc160595292da5598b84476539442761da74a969841ccd95071e116"} Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.021547 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4wpd" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.021556 4822 scope.go:117] "RemoveContainer" containerID="02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.055116 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.059658 4822 scope.go:117] "RemoveContainer" containerID="c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.064604 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4wpd"] Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.087161 4822 scope.go:117] "RemoveContainer" containerID="cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.117032 4822 scope.go:117] "RemoveContainer" containerID="02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955" Oct 10 07:35:40 crc kubenswrapper[4822]: E1010 07:35:40.117523 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955\": container 
with ID starting with 02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955 not found: ID does not exist" containerID="02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.117568 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955"} err="failed to get container status \"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955\": rpc error: code = NotFound desc = could not find container \"02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955\": container with ID starting with 02f7adf353260963d516ef253cc8df01839a11eb28cce19f828eedfb78f11955 not found: ID does not exist" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.117600 4822 scope.go:117] "RemoveContainer" containerID="c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0" Oct 10 07:35:40 crc kubenswrapper[4822]: E1010 07:35:40.118336 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0\": container with ID starting with c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0 not found: ID does not exist" containerID="c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.118437 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0"} err="failed to get container status \"c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0\": rpc error: code = NotFound desc = could not find container \"c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0\": container with ID starting with c9f56793332ce720d0e3dd4e148f9a6fc0e58c7e82a2379fc6b0c4a3d720fdf0 not 
found: ID does not exist" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.118512 4822 scope.go:117] "RemoveContainer" containerID="cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8" Oct 10 07:35:40 crc kubenswrapper[4822]: E1010 07:35:40.119105 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8\": container with ID starting with cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8 not found: ID does not exist" containerID="cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8" Oct 10 07:35:40 crc kubenswrapper[4822]: I1010 07:35:40.119166 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8"} err="failed to get container status \"cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8\": rpc error: code = NotFound desc = could not find container \"cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8\": container with ID starting with cc70d4d2ab4eb7c7af369c8e8804ddc8e08b4be6e55c963767a5bd111df1bfb8 not found: ID does not exist" Oct 10 07:35:41 crc kubenswrapper[4822]: I1010 07:35:41.661729 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a016559-c659-48f0-b72f-63a2d344018a" path="/var/lib/kubelet/pods/8a016559-c659-48f0-b72f-63a2d344018a/volumes" Oct 10 07:36:01 crc kubenswrapper[4822]: I1010 07:36:01.336405 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:36:01 crc kubenswrapper[4822]: I1010 07:36:01.337036 4822 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:36:01 crc kubenswrapper[4822]: I1010 07:36:01.337078 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:36:01 crc kubenswrapper[4822]: I1010 07:36:01.337676 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:36:01 crc kubenswrapper[4822]: I1010 07:36:01.337730 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" gracePeriod=600 Oct 10 07:36:01 crc kubenswrapper[4822]: E1010 07:36:01.456520 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:36:02 crc kubenswrapper[4822]: I1010 07:36:02.207295 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" 
containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" exitCode=0 Oct 10 07:36:02 crc kubenswrapper[4822]: I1010 07:36:02.207352 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc"} Oct 10 07:36:02 crc kubenswrapper[4822]: I1010 07:36:02.207403 4822 scope.go:117] "RemoveContainer" containerID="31c516e74088fb46136379a06087e2eaba688082bdf389c465441c4e7785f356" Oct 10 07:36:02 crc kubenswrapper[4822]: I1010 07:36:02.209375 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:36:02 crc kubenswrapper[4822]: E1010 07:36:02.209938 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:36:14 crc kubenswrapper[4822]: I1010 07:36:14.650630 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:36:14 crc kubenswrapper[4822]: E1010 07:36:14.651612 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:36:28 crc kubenswrapper[4822]: I1010 
07:36:28.650999 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:36:28 crc kubenswrapper[4822]: E1010 07:36:28.651900 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:36:42 crc kubenswrapper[4822]: I1010 07:36:42.650089 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:36:42 crc kubenswrapper[4822]: E1010 07:36:42.650969 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:36:55 crc kubenswrapper[4822]: I1010 07:36:55.651187 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:36:55 crc kubenswrapper[4822]: E1010 07:36:55.652429 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:37:10 crc 
kubenswrapper[4822]: I1010 07:37:10.651212 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:37:10 crc kubenswrapper[4822]: E1010 07:37:10.652016 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:37:24 crc kubenswrapper[4822]: I1010 07:37:24.650866 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:37:24 crc kubenswrapper[4822]: E1010 07:37:24.651956 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:37:36 crc kubenswrapper[4822]: I1010 07:37:36.650441 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:37:36 crc kubenswrapper[4822]: E1010 07:37:36.651221 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 
10 07:37:51 crc kubenswrapper[4822]: I1010 07:37:51.650762 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:37:51 crc kubenswrapper[4822]: E1010 07:37:51.652021 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:38:01 crc kubenswrapper[4822]: I1010 07:38:01.965278 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bwbxr"] Oct 10 07:38:01 crc kubenswrapper[4822]: I1010 07:38:01.973825 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bwbxr"] Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.141914 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tkd99"] Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 07:38:02.142302 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="extract-content" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142329 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="extract-content" Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 07:38:02.142360 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="extract-utilities" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142371 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="extract-utilities" Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 
07:38:02.142388 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142396 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 07:38:02.142406 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142413 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 07:38:02.142423 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="extract-utilities" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142432 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="extract-utilities" Oct 10 07:38:02 crc kubenswrapper[4822]: E1010 07:38:02.142448 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="extract-content" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142457 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="extract-content" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142618 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c9462-8354-450f-9ad2-43ea510d66a2" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.142632 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a016559-c659-48f0-b72f-63a2d344018a" containerName="registry-server" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 
07:38:02.143290 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.146848 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.146881 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.148671 4822 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cxqpg" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.150564 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.154626 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tkd99"] Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.220042 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.220505 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.220549 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t44c\" (UniqueName: 
\"kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.322538 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.322675 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.322728 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t44c\" (UniqueName: \"kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.323117 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.323861 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage\") pod \"crc-storage-crc-tkd99\" (UID: 
\"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.342397 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t44c\" (UniqueName: \"kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c\") pod \"crc-storage-crc-tkd99\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.464341 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.949296 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tkd99"] Oct 10 07:38:02 crc kubenswrapper[4822]: I1010 07:38:02.960149 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:38:03 crc kubenswrapper[4822]: I1010 07:38:03.323596 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tkd99" event={"ID":"15a32129-a2a0-4cba-b20b-fd2801219f23","Type":"ContainerStarted","Data":"6e23965acdde3ecf99408d2956bc5e8bb430a23e982c4f964807138683ef639e"} Oct 10 07:38:03 crc kubenswrapper[4822]: I1010 07:38:03.677000 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d7546a4-c45d-41a7-a7eb-24119b7c951a" path="/var/lib/kubelet/pods/8d7546a4-c45d-41a7-a7eb-24119b7c951a/volumes" Oct 10 07:38:06 crc kubenswrapper[4822]: I1010 07:38:06.351903 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tkd99" event={"ID":"15a32129-a2a0-4cba-b20b-fd2801219f23","Type":"ContainerStarted","Data":"bc2bc23f3ce8594ba4a536947b02565e2d300edf4557017316b62e03ed3cf795"} Oct 10 07:38:06 crc kubenswrapper[4822]: I1010 07:38:06.373170 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="crc-storage/crc-storage-crc-tkd99" podStartSLOduration=1.5906579619999999 podStartE2EDuration="4.37314408s" podCreationTimestamp="2025-10-10 07:38:02 +0000 UTC" firstStartedPulling="2025-10-10 07:38:02.959848473 +0000 UTC m=+4430.055006699" lastFinishedPulling="2025-10-10 07:38:05.742334581 +0000 UTC m=+4432.837492817" observedRunningTime="2025-10-10 07:38:06.366342394 +0000 UTC m=+4433.461500600" watchObservedRunningTime="2025-10-10 07:38:06.37314408 +0000 UTC m=+4433.468302306" Oct 10 07:38:06 crc kubenswrapper[4822]: I1010 07:38:06.650706 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:38:06 crc kubenswrapper[4822]: E1010 07:38:06.651135 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:38:07 crc kubenswrapper[4822]: I1010 07:38:07.361735 4822 generic.go:334] "Generic (PLEG): container finished" podID="15a32129-a2a0-4cba-b20b-fd2801219f23" containerID="bc2bc23f3ce8594ba4a536947b02565e2d300edf4557017316b62e03ed3cf795" exitCode=0 Oct 10 07:38:07 crc kubenswrapper[4822]: I1010 07:38:07.361787 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tkd99" event={"ID":"15a32129-a2a0-4cba-b20b-fd2801219f23","Type":"ContainerDied","Data":"bc2bc23f3ce8594ba4a536947b02565e2d300edf4557017316b62e03ed3cf795"} Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.700318 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.733218 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage\") pod \"15a32129-a2a0-4cba-b20b-fd2801219f23\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.733312 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt\") pod \"15a32129-a2a0-4cba-b20b-fd2801219f23\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.733454 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t44c\" (UniqueName: \"kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c\") pod \"15a32129-a2a0-4cba-b20b-fd2801219f23\" (UID: \"15a32129-a2a0-4cba-b20b-fd2801219f23\") " Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.735363 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "15a32129-a2a0-4cba-b20b-fd2801219f23" (UID: "15a32129-a2a0-4cba-b20b-fd2801219f23"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.744039 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c" (OuterVolumeSpecName: "kube-api-access-4t44c") pod "15a32129-a2a0-4cba-b20b-fd2801219f23" (UID: "15a32129-a2a0-4cba-b20b-fd2801219f23"). InnerVolumeSpecName "kube-api-access-4t44c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.753371 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "15a32129-a2a0-4cba-b20b-fd2801219f23" (UID: "15a32129-a2a0-4cba-b20b-fd2801219f23"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.834757 4822 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/15a32129-a2a0-4cba-b20b-fd2801219f23-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.834822 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t44c\" (UniqueName: \"kubernetes.io/projected/15a32129-a2a0-4cba-b20b-fd2801219f23-kube-api-access-4t44c\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:08 crc kubenswrapper[4822]: I1010 07:38:08.834843 4822 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/15a32129-a2a0-4cba-b20b-fd2801219f23-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:09 crc kubenswrapper[4822]: I1010 07:38:09.386956 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tkd99" event={"ID":"15a32129-a2a0-4cba-b20b-fd2801219f23","Type":"ContainerDied","Data":"6e23965acdde3ecf99408d2956bc5e8bb430a23e982c4f964807138683ef639e"} Oct 10 07:38:09 crc kubenswrapper[4822]: I1010 07:38:09.387358 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e23965acdde3ecf99408d2956bc5e8bb430a23e982c4f964807138683ef639e" Oct 10 07:38:09 crc kubenswrapper[4822]: I1010 07:38:09.386997 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tkd99" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.692574 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tkd99"] Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.696964 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tkd99"] Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.809638 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dtwtk"] Oct 10 07:38:10 crc kubenswrapper[4822]: E1010 07:38:10.809907 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a32129-a2a0-4cba-b20b-fd2801219f23" containerName="storage" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.809918 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a32129-a2a0-4cba-b20b-fd2801219f23" containerName="storage" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.810129 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a32129-a2a0-4cba-b20b-fd2801219f23" containerName="storage" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.810567 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.813067 4822 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cxqpg" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.813162 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.813204 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.813304 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.836244 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dtwtk"] Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.963655 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.963709 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:10 crc kubenswrapper[4822]: I1010 07:38:10.963924 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p84jj\" (UniqueName: \"kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj\") pod \"crc-storage-crc-dtwtk\" (UID: 
\"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.065106 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p84jj\" (UniqueName: \"kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.065608 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.067416 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.067604 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.068411 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.098860 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p84jj\" (UniqueName: \"kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj\") pod \"crc-storage-crc-dtwtk\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.132918 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.646952 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dtwtk"] Oct 10 07:38:11 crc kubenswrapper[4822]: I1010 07:38:11.664878 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a32129-a2a0-4cba-b20b-fd2801219f23" path="/var/lib/kubelet/pods/15a32129-a2a0-4cba-b20b-fd2801219f23/volumes" Oct 10 07:38:12 crc kubenswrapper[4822]: I1010 07:38:12.409920 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dtwtk" event={"ID":"c8f7dbf1-3dca-400a-814d-51e8a42db457","Type":"ContainerStarted","Data":"3f1129f8ab41a929a929f0ef09b9922cad013be3d072bf29e2d39e69513f2512"} Oct 10 07:38:13 crc kubenswrapper[4822]: I1010 07:38:13.431312 4822 generic.go:334] "Generic (PLEG): container finished" podID="c8f7dbf1-3dca-400a-814d-51e8a42db457" containerID="4ce54ea4916ba95cbd44fa36f6ef2352acc8f045b10c2f242850c3d63c4c03e0" exitCode=0 Oct 10 07:38:13 crc kubenswrapper[4822]: I1010 07:38:13.431555 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dtwtk" event={"ID":"c8f7dbf1-3dca-400a-814d-51e8a42db457","Type":"ContainerDied","Data":"4ce54ea4916ba95cbd44fa36f6ef2352acc8f045b10c2f242850c3d63c4c03e0"} Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.777241 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.926322 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage\") pod \"c8f7dbf1-3dca-400a-814d-51e8a42db457\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.926815 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt\") pod \"c8f7dbf1-3dca-400a-814d-51e8a42db457\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.926905 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p84jj\" (UniqueName: \"kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj\") pod \"c8f7dbf1-3dca-400a-814d-51e8a42db457\" (UID: \"c8f7dbf1-3dca-400a-814d-51e8a42db457\") " Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.926978 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c8f7dbf1-3dca-400a-814d-51e8a42db457" (UID: "c8f7dbf1-3dca-400a-814d-51e8a42db457"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.927274 4822 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c8f7dbf1-3dca-400a-814d-51e8a42db457-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.931139 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj" (OuterVolumeSpecName: "kube-api-access-p84jj") pod "c8f7dbf1-3dca-400a-814d-51e8a42db457" (UID: "c8f7dbf1-3dca-400a-814d-51e8a42db457"). InnerVolumeSpecName "kube-api-access-p84jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:38:14 crc kubenswrapper[4822]: I1010 07:38:14.942216 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c8f7dbf1-3dca-400a-814d-51e8a42db457" (UID: "c8f7dbf1-3dca-400a-814d-51e8a42db457"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:38:15 crc kubenswrapper[4822]: I1010 07:38:15.028525 4822 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c8f7dbf1-3dca-400a-814d-51e8a42db457-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:15 crc kubenswrapper[4822]: I1010 07:38:15.028586 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p84jj\" (UniqueName: \"kubernetes.io/projected/c8f7dbf1-3dca-400a-814d-51e8a42db457-kube-api-access-p84jj\") on node \"crc\" DevicePath \"\"" Oct 10 07:38:15 crc kubenswrapper[4822]: I1010 07:38:15.453721 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dtwtk" event={"ID":"c8f7dbf1-3dca-400a-814d-51e8a42db457","Type":"ContainerDied","Data":"3f1129f8ab41a929a929f0ef09b9922cad013be3d072bf29e2d39e69513f2512"} Oct 10 07:38:15 crc kubenswrapper[4822]: I1010 07:38:15.453771 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f1129f8ab41a929a929f0ef09b9922cad013be3d072bf29e2d39e69513f2512" Oct 10 07:38:15 crc kubenswrapper[4822]: I1010 07:38:15.453864 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dtwtk" Oct 10 07:38:19 crc kubenswrapper[4822]: I1010 07:38:19.656082 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:38:19 crc kubenswrapper[4822]: E1010 07:38:19.657708 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:38:22 crc kubenswrapper[4822]: I1010 07:38:22.329272 4822 scope.go:117] "RemoveContainer" containerID="bd310d46d1233151146148c4bd637d08c61c1dc6f4a01fee315ce12118ba52c1" Oct 10 07:38:34 crc kubenswrapper[4822]: I1010 07:38:34.650539 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:38:34 crc kubenswrapper[4822]: E1010 07:38:34.651576 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:38:49 crc kubenswrapper[4822]: I1010 07:38:49.651421 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:38:49 crc kubenswrapper[4822]: E1010 07:38:49.652161 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:39:01 crc kubenswrapper[4822]: I1010 07:39:01.650714 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:39:01 crc kubenswrapper[4822]: E1010 07:39:01.651721 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:39:12 crc kubenswrapper[4822]: I1010 07:39:12.650863 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:39:12 crc kubenswrapper[4822]: E1010 07:39:12.652177 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:39:27 crc kubenswrapper[4822]: I1010 07:39:27.649981 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:39:27 crc kubenswrapper[4822]: E1010 07:39:27.650724 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:39:39 crc kubenswrapper[4822]: I1010 07:39:39.650102 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:39:39 crc kubenswrapper[4822]: E1010 07:39:39.650744 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:39:52 crc kubenswrapper[4822]: I1010 07:39:52.651266 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:39:52 crc kubenswrapper[4822]: E1010 07:39:52.652399 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:40:05 crc kubenswrapper[4822]: I1010 07:40:05.650263 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:40:05 crc kubenswrapper[4822]: E1010 07:40:05.651283 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:40:19 crc kubenswrapper[4822]: I1010 07:40:19.650767 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:40:19 crc kubenswrapper[4822]: E1010 07:40:19.651926 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:40:30 crc kubenswrapper[4822]: I1010 07:40:30.650628 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:40:30 crc kubenswrapper[4822]: E1010 07:40:30.653143 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:40:43 crc kubenswrapper[4822]: I1010 07:40:43.662073 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:40:43 crc kubenswrapper[4822]: E1010 07:40:43.663307 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:40:54 crc kubenswrapper[4822]: I1010 07:40:54.650065 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:40:54 crc kubenswrapper[4822]: E1010 07:40:54.652505 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:41:06 crc kubenswrapper[4822]: I1010 07:41:06.651024 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:41:06 crc kubenswrapper[4822]: I1010 07:41:06.950876 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae"} Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.483697 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:15 crc kubenswrapper[4822]: E1010 07:41:15.484992 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f7dbf1-3dca-400a-814d-51e8a42db457" containerName="storage" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.485016 4822 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c8f7dbf1-3dca-400a-814d-51e8a42db457" containerName="storage" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.485310 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f7dbf1-3dca-400a-814d-51e8a42db457" containerName="storage" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.487243 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.504939 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.582086 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55f55\" (UniqueName: \"kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.582172 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.582236 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.684083 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.684176 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55f55\" (UniqueName: \"kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.684224 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.684778 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.685012 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.719385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-55f55\" (UniqueName: \"kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55\") pod \"community-operators-wc46g\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:15 crc kubenswrapper[4822]: I1010 07:41:15.821138 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:16 crc kubenswrapper[4822]: I1010 07:41:16.126208 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:17 crc kubenswrapper[4822]: I1010 07:41:17.069397 4822 generic.go:334] "Generic (PLEG): container finished" podID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerID="b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0" exitCode=0 Oct 10 07:41:17 crc kubenswrapper[4822]: I1010 07:41:17.069491 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerDied","Data":"b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0"} Oct 10 07:41:17 crc kubenswrapper[4822]: I1010 07:41:17.069685 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerStarted","Data":"a4d62364e432236e8c39e98a4b116a18e713324b28f9af270de8632c9cc7b219"} Oct 10 07:41:18 crc kubenswrapper[4822]: I1010 07:41:18.083234 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerStarted","Data":"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe"} Oct 10 07:41:19 crc kubenswrapper[4822]: I1010 07:41:19.098194 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerID="0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe" exitCode=0 Oct 10 07:41:19 crc kubenswrapper[4822]: I1010 07:41:19.098261 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerDied","Data":"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe"} Oct 10 07:41:20 crc kubenswrapper[4822]: I1010 07:41:20.107768 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerStarted","Data":"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9"} Oct 10 07:41:20 crc kubenswrapper[4822]: I1010 07:41:20.135859 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wc46g" podStartSLOduration=2.652246645 podStartE2EDuration="5.135840026s" podCreationTimestamp="2025-10-10 07:41:15 +0000 UTC" firstStartedPulling="2025-10-10 07:41:17.071987265 +0000 UTC m=+4624.167145501" lastFinishedPulling="2025-10-10 07:41:19.555580656 +0000 UTC m=+4626.650738882" observedRunningTime="2025-10-10 07:41:20.12975674 +0000 UTC m=+4627.224914946" watchObservedRunningTime="2025-10-10 07:41:20.135840026 +0000 UTC m=+4627.230998232" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.783209 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.784934 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.786590 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.787468 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.787531 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-q95z6" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.787569 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.787790 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.790873 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.858759 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.858825 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.858886 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dmj7b\" (UniqueName: \"kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.959555 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.959634 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmj7b\" (UniqueName: \"kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.959684 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.960704 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.960908 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:24 crc kubenswrapper[4822]: I1010 07:41:24.996979 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmj7b\" (UniqueName: \"kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b\") pod \"dnsmasq-dns-5d7b5456f5-lq8nx\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.064895 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.072267 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.078657 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.105450 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.175464 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.175584 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.175617 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.276725 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.276784 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " 
pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.277053 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.278367 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.278482 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.300470 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8\") pod \"dnsmasq-dns-98ddfc8f-vdm6z\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.397156 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.413581 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.822303 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.822634 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.859480 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.885587 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.935257 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.936410 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.941182 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.941673 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.942085 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.941842 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sv4bl" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.942330 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.946863 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993031 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993084 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993129 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993148 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993168 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8g6h\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993187 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993342 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993434 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:25 crc kubenswrapper[4822]: I1010 07:41:25.993582 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.094934 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.094978 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095003 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8g6h\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095021 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095057 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095074 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095119 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095144 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095166 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " 
pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.095974 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.096514 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.096604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.097432 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.100533 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.100533 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.102202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.110530 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.110559 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e19833da2871e608863fda1c55a3dff7d80fba167ff1bbc70233ad2f6e9936b/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.114319 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8g6h\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h\") pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.141276 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") 
pod \"rabbitmq-server-0\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.161082 4822 generic.go:334] "Generic (PLEG): container finished" podID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerID="aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b" exitCode=0 Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.161181 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" event={"ID":"91388f2d-2b39-4988-a4b2-943db1e2da06","Type":"ContainerDied","Data":"aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b"} Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.161224 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" event={"ID":"91388f2d-2b39-4988-a4b2-943db1e2da06","Type":"ContainerStarted","Data":"3bdc717937e7725689aa598c6dd3a80e1935da1c77a1b4ad2f503bb7d9f83235"} Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.162944 4822 generic.go:334] "Generic (PLEG): container finished" podID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerID="6dcc46307d77a1f15c0928a4eabe9f7ded3784f4af62beb4cd556c2205346219" exitCode=0 Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.163046 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" event={"ID":"5391ed5a-94fe-42ec-9c74-034697b6950f","Type":"ContainerDied","Data":"6dcc46307d77a1f15c0928a4eabe9f7ded3784f4af62beb4cd556c2205346219"} Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.163093 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" event={"ID":"5391ed5a-94fe-42ec-9c74-034697b6950f","Type":"ContainerStarted","Data":"e53a2dca3627f32f7021378743a2a4cb5d9d4e448fe5fda25b3ccad821091c3f"} Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.244736 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.255259 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.257056 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.258546 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.258718 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.258867 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jpb2q" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.259025 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.259349 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.267906 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.286419 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305351 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305389 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305420 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305448 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305491 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305540 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305575 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305612 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wj77\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.305635 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.322466 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:26 crc kubenswrapper[4822]: E1010 07:41:26.350666 4822 log.go:32] "CreateContainer in 
sandbox from runtime service failed" err=< Oct 10 07:41:26 crc kubenswrapper[4822]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/91388f2d-2b39-4988-a4b2-943db1e2da06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 10 07:41:26 crc kubenswrapper[4822]: > podSandboxID="3bdc717937e7725689aa598c6dd3a80e1935da1c77a1b4ad2f503bb7d9f83235" Oct 10 07:41:26 crc kubenswrapper[4822]: E1010 07:41:26.350807 4822 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 10 07:41:26 crc kubenswrapper[4822]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmj7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-lq8nx_openstack(91388f2d-2b39-4988-a4b2-943db1e2da06): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/91388f2d-2b39-4988-a4b2-943db1e2da06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 10 07:41:26 crc kubenswrapper[4822]: > logger="UnhandledError" Oct 10 07:41:26 crc kubenswrapper[4822]: E1010 07:41:26.352364 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/91388f2d-2b39-4988-a4b2-943db1e2da06/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" 
podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407704 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wj77\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407750 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407829 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407851 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 
07:41:26.407876 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407908 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407939 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.407965 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.408420 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.408441 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.409058 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.410033 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.410673 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.410703 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a75def2ebe05f400a36c3ff2d587ad52f06bc348c19f821d527d36d8a0b81552/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.413400 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.413397 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.415029 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.423826 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wj77\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.442258 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.596359 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.730836 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.862784 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.863976 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.865587 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.867244 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.867621 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-sxrsj" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.867640 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.867729 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.880853 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.882891 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914097 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914174 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-kolla-config\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" 
Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914277 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-577th\" (UniqueName: \"kubernetes.io/projected/cbd2b8da-ad61-4a22-be7c-5639531463de-kube-api-access-577th\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914349 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914387 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914413 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-747deade-f7d0-46b5-9989-7e478553a27d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-747deade-f7d0-46b5-9989-7e478553a27d\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914440 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " 
pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914609 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-default\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:26 crc kubenswrapper[4822]: I1010 07:41:26.914927 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-secrets\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.016751 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-kolla-config\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.016915 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-577th\" (UniqueName: \"kubernetes.io/projected/cbd2b8da-ad61-4a22-be7c-5639531463de-kube-api-access-577th\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.016999 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017063 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017121 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-747deade-f7d0-46b5-9989-7e478553a27d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-747deade-f7d0-46b5-9989-7e478553a27d\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017183 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017235 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-default\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017364 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-secrets\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.017417 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.018945 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.019116 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-config-data-default\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.019854 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.021150 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbd2b8da-ad61-4a22-be7c-5639531463de-kolla-config\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.021665 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-secrets\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " 
pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.023459 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.024419 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.024449 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-747deade-f7d0-46b5-9989-7e478553a27d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-747deade-f7d0-46b5-9989-7e478553a27d\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8816706a622c4a1d0134fbbca63eed402e58e688e8af64161cf7ccc2164cc331/globalmount\"" pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.033527 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd2b8da-ad61-4a22-be7c-5639531463de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.037617 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-577th\" (UniqueName: \"kubernetes.io/projected/cbd2b8da-ad61-4a22-be7c-5639531463de-kube-api-access-577th\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.065912 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-747deade-f7d0-46b5-9989-7e478553a27d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-747deade-f7d0-46b5-9989-7e478553a27d\") pod \"openstack-galera-0\" (UID: \"cbd2b8da-ad61-4a22-be7c-5639531463de\") " pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.079700 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:41:27 crc kubenswrapper[4822]: W1010 07:41:27.099266 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddab00d16_02d5_42a5_95fc_88802736edf2.slice/crio-bf3b5b5a6c73eee06e66c5fefc52216eee2444963e856d65ce2f72f083c68996 WatchSource:0}: Error finding container bf3b5b5a6c73eee06e66c5fefc52216eee2444963e856d65ce2f72f083c68996: Status 404 returned error can't find the container with id bf3b5b5a6c73eee06e66c5fefc52216eee2444963e856d65ce2f72f083c68996 Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.176343 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" event={"ID":"5391ed5a-94fe-42ec-9c74-034697b6950f","Type":"ContainerStarted","Data":"3e47d42ea7a8a1c83d3749a5f423c560d2632c77e57662e21e61300f45cf9902"} Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.176627 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.179700 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerStarted","Data":"bf3b5b5a6c73eee06e66c5fefc52216eee2444963e856d65ce2f72f083c68996"} Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.180728 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.182563 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerStarted","Data":"b202148f7cd6b3e5b3ed171b3703ab43c71931189b2baa11cbef4597210e7b24"} Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.207195 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" podStartSLOduration=2.207176695 podStartE2EDuration="2.207176695s" podCreationTimestamp="2025-10-10 07:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:41:27.204346714 +0000 UTC m=+4634.299504940" watchObservedRunningTime="2025-10-10 07:41:27.207176695 +0000 UTC m=+4634.302334891" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.455667 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.457839 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.459771 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dn4lz" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.460462 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.474719 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.524863 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-config-data\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.524906 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzrx\" (UniqueName: \"kubernetes.io/projected/661709bf-a225-4ecb-a2ce-245c8dd7af77-kube-api-access-8hzrx\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.524947 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-kolla-config\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.623822 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.625591 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzrx\" 
(UniqueName: \"kubernetes.io/projected/661709bf-a225-4ecb-a2ce-245c8dd7af77-kube-api-access-8hzrx\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.625644 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-kolla-config\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.625714 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-config-data\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.626529 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-config-data\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.627197 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/661709bf-a225-4ecb-a2ce-245c8dd7af77-kolla-config\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 07:41:27.670375 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzrx\" (UniqueName: \"kubernetes.io/projected/661709bf-a225-4ecb-a2ce-245c8dd7af77-kube-api-access-8hzrx\") pod \"memcached-0\" (UID: \"661709bf-a225-4ecb-a2ce-245c8dd7af77\") " pod="openstack/memcached-0" Oct 10 07:41:27 crc kubenswrapper[4822]: I1010 
07:41:27.775888 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.194886 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerStarted","Data":"f416e7ec2a96c6d518e22f30c62d024fc2a5e419ae8f59d187b9933c13ed0023"} Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.198458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cbd2b8da-ad61-4a22-be7c-5639531463de","Type":"ContainerStarted","Data":"9566383d83e2712477aafd7fc3ebcede30f94e3a34b9afd6bb2e18e856199c71"} Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.198496 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cbd2b8da-ad61-4a22-be7c-5639531463de","Type":"ContainerStarted","Data":"7e7ee5efc25dc35c8a7b2e58b918fbe50fe2d3aa9fa141e05f50a383acb8c373"} Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.201447 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" event={"ID":"91388f2d-2b39-4988-a4b2-943db1e2da06","Type":"ContainerStarted","Data":"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a"} Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.201938 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.202233 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wc46g" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="registry-server" containerID="cri-o://9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9" gracePeriod=2 Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.243033 4822 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/memcached-0"] Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.287668 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" podStartSLOduration=4.287647524 podStartE2EDuration="4.287647524s" podCreationTimestamp="2025-10-10 07:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:41:28.282072453 +0000 UTC m=+4635.377230659" watchObservedRunningTime="2025-10-10 07:41:28.287647524 +0000 UTC m=+4635.382805730" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.605595 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.607604 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.609512 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.621729 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2jk96" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.622058 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.622142 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.630760 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.678017 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750766 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750858 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctwj\" (UniqueName: \"kubernetes.io/projected/123c29f8-7721-41ab-81f1-887769d9e1c2-kube-api-access-dctwj\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750905 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750935 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750971 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-secrets\") pod \"openstack-cell1-galera-0\" 
(UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.750993 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.751014 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.751241 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.751324 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.852880 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities\") pod \"f599a9e7-b6b8-4a05-8acf-545095921c0a\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.852979 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55f55\" (UniqueName: \"kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55\") pod \"f599a9e7-b6b8-4a05-8acf-545095921c0a\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853255 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content\") pod \"f599a9e7-b6b8-4a05-8acf-545095921c0a\" (UID: \"f599a9e7-b6b8-4a05-8acf-545095921c0a\") " Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853666 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853754 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " 
pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853826 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities" (OuterVolumeSpecName: "utilities") pod "f599a9e7-b6b8-4a05-8acf-545095921c0a" (UID: "f599a9e7-b6b8-4a05-8acf-545095921c0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853820 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.853941 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854041 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854074 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854229 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854275 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctwj\" (UniqueName: \"kubernetes.io/projected/123c29f8-7721-41ab-81f1-887769d9e1c2-kube-api-access-dctwj\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854404 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.854482 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.855249 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.855925 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.856761 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.857201 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/691d34cc7611a1d6944a03d12c43e754711c478b90d188697b2e060d100953cf/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.857394 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/123c29f8-7721-41ab-81f1-887769d9e1c2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.858795 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55" (OuterVolumeSpecName: "kube-api-access-55f55") pod "f599a9e7-b6b8-4a05-8acf-545095921c0a" (UID: "f599a9e7-b6b8-4a05-8acf-545095921c0a"). InnerVolumeSpecName "kube-api-access-55f55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.860096 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.860414 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.870712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123c29f8-7721-41ab-81f1-887769d9e1c2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.883680 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctwj\" (UniqueName: \"kubernetes.io/projected/123c29f8-7721-41ab-81f1-887769d9e1c2-kube-api-access-dctwj\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.917497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c1e11d0-0407-4115-850a-39f7e7c506e2\") pod \"openstack-cell1-galera-0\" (UID: \"123c29f8-7721-41ab-81f1-887769d9e1c2\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:28 crc 
kubenswrapper[4822]: I1010 07:41:28.956368 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55f55\" (UniqueName: \"kubernetes.io/projected/f599a9e7-b6b8-4a05-8acf-545095921c0a-kube-api-access-55f55\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:28 crc kubenswrapper[4822]: I1010 07:41:28.994713 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.105631 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f599a9e7-b6b8-4a05-8acf-545095921c0a" (UID: "f599a9e7-b6b8-4a05-8acf-545095921c0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.158854 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f599a9e7-b6b8-4a05-8acf-545095921c0a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.213847 4822 generic.go:334] "Generic (PLEG): container finished" podID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerID="9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9" exitCode=0 Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.213905 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerDied","Data":"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9"} Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.213937 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc46g" 
event={"ID":"f599a9e7-b6b8-4a05-8acf-545095921c0a","Type":"ContainerDied","Data":"a4d62364e432236e8c39e98a4b116a18e713324b28f9af270de8632c9cc7b219"} Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.213957 4822 scope.go:117] "RemoveContainer" containerID="9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.214074 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wc46g" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.226092 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"661709bf-a225-4ecb-a2ce-245c8dd7af77","Type":"ContainerStarted","Data":"b4bb93642ad03cf2e208aaa9076d037b4c70a6d8571754f3dab5d01db88d569e"} Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.226129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"661709bf-a225-4ecb-a2ce-245c8dd7af77","Type":"ContainerStarted","Data":"6646694d1ee8c0a6a930cfb6fa8aaba3924d1692e7c7ad183bca2a4d3086359d"} Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.226358 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.229521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerStarted","Data":"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582"} Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.232866 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.239159 4822 scope.go:117] "RemoveContainer" containerID="0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.246803 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.246783461 podStartE2EDuration="2.246783461s" podCreationTimestamp="2025-10-10 07:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:41:29.245308668 +0000 UTC m=+4636.340466864" watchObservedRunningTime="2025-10-10 07:41:29.246783461 +0000 UTC m=+4636.341941677" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.281176 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.288562 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wc46g"] Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.309345 4822 scope.go:117] "RemoveContainer" containerID="b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.351054 4822 scope.go:117] "RemoveContainer" containerID="9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9" Oct 10 07:41:29 crc kubenswrapper[4822]: E1010 07:41:29.351502 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9\": container with ID starting with 9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9 not found: ID does not exist" containerID="9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.351533 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9"} err="failed to get container status \"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9\": rpc 
error: code = NotFound desc = could not find container \"9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9\": container with ID starting with 9d517b55fb3c6b5b3ceae82db662386034df14ae2eced0948d6aaf6b590f8ca9 not found: ID does not exist" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.351555 4822 scope.go:117] "RemoveContainer" containerID="0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe" Oct 10 07:41:29 crc kubenswrapper[4822]: E1010 07:41:29.351795 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe\": container with ID starting with 0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe not found: ID does not exist" containerID="0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.351850 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe"} err="failed to get container status \"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe\": rpc error: code = NotFound desc = could not find container \"0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe\": container with ID starting with 0e5d3a9e3ec477c9cd39d6daab52564bba5376bddd5d3db3097afc374dee05fe not found: ID does not exist" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.351870 4822 scope.go:117] "RemoveContainer" containerID="b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0" Oct 10 07:41:29 crc kubenswrapper[4822]: E1010 07:41:29.352223 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0\": container with ID starting with 
b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0 not found: ID does not exist" containerID="b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.352248 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0"} err="failed to get container status \"b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0\": rpc error: code = NotFound desc = could not find container \"b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0\": container with ID starting with b5364cbe06471cdba1c0c3b8ca7837aba77eb2df9cd13359b8c075cbdf85d4e0 not found: ID does not exist" Oct 10 07:41:29 crc kubenswrapper[4822]: I1010 07:41:29.662794 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" path="/var/lib/kubelet/pods/f599a9e7-b6b8-4a05-8acf-545095921c0a/volumes" Oct 10 07:41:30 crc kubenswrapper[4822]: I1010 07:41:30.241641 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"123c29f8-7721-41ab-81f1-887769d9e1c2","Type":"ContainerStarted","Data":"a8a4088b562326cca62658bb2501a0448cf3c265ace44d481be0e8679ab1128f"} Oct 10 07:41:30 crc kubenswrapper[4822]: I1010 07:41:30.242063 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"123c29f8-7721-41ab-81f1-887769d9e1c2","Type":"ContainerStarted","Data":"fe41241753304fec837bfa6d9b2d27faae11e2da58762dc22f9fa672114b9b77"} Oct 10 07:41:32 crc kubenswrapper[4822]: I1010 07:41:32.263713 4822 generic.go:334] "Generic (PLEG): container finished" podID="cbd2b8da-ad61-4a22-be7c-5639531463de" containerID="9566383d83e2712477aafd7fc3ebcede30f94e3a34b9afd6bb2e18e856199c71" exitCode=0 Oct 10 07:41:32 crc kubenswrapper[4822]: I1010 07:41:32.263775 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-galera-0" event={"ID":"cbd2b8da-ad61-4a22-be7c-5639531463de","Type":"ContainerDied","Data":"9566383d83e2712477aafd7fc3ebcede30f94e3a34b9afd6bb2e18e856199c71"} Oct 10 07:41:33 crc kubenswrapper[4822]: I1010 07:41:33.281970 4822 generic.go:334] "Generic (PLEG): container finished" podID="123c29f8-7721-41ab-81f1-887769d9e1c2" containerID="a8a4088b562326cca62658bb2501a0448cf3c265ace44d481be0e8679ab1128f" exitCode=0 Oct 10 07:41:33 crc kubenswrapper[4822]: I1010 07:41:33.282074 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"123c29f8-7721-41ab-81f1-887769d9e1c2","Type":"ContainerDied","Data":"a8a4088b562326cca62658bb2501a0448cf3c265ace44d481be0e8679ab1128f"} Oct 10 07:41:33 crc kubenswrapper[4822]: I1010 07:41:33.289919 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"cbd2b8da-ad61-4a22-be7c-5639531463de","Type":"ContainerStarted","Data":"05968e1ec8d0aa38f746650886432c59fc099ebd15871fb1c6669b25aa7d470b"} Oct 10 07:41:33 crc kubenswrapper[4822]: I1010 07:41:33.353838 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.353783632 podStartE2EDuration="8.353783632s" podCreationTimestamp="2025-10-10 07:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:41:33.351933499 +0000 UTC m=+4640.447091715" watchObservedRunningTime="2025-10-10 07:41:33.353783632 +0000 UTC m=+4640.448941858" Oct 10 07:41:34 crc kubenswrapper[4822]: I1010 07:41:34.303018 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"123c29f8-7721-41ab-81f1-887769d9e1c2","Type":"ContainerStarted","Data":"0ef60574fad7484cdf908126e624337ba72a774d8a371b6efbf564ef6be85ae9"} Oct 10 07:41:34 crc kubenswrapper[4822]: I1010 
07:41:34.347533 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.347501537 podStartE2EDuration="7.347501537s" podCreationTimestamp="2025-10-10 07:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:41:34.340181396 +0000 UTC m=+4641.435339672" watchObservedRunningTime="2025-10-10 07:41:34.347501537 +0000 UTC m=+4641.442659763" Oct 10 07:41:35 crc kubenswrapper[4822]: I1010 07:41:35.107012 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:35 crc kubenswrapper[4822]: I1010 07:41:35.399708 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:41:35 crc kubenswrapper[4822]: I1010 07:41:35.470577 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:35 crc kubenswrapper[4822]: I1010 07:41:35.473311 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="dnsmasq-dns" containerID="cri-o://705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a" gracePeriod=10 Oct 10 07:41:35 crc kubenswrapper[4822]: I1010 07:41:35.914954 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.091383 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc\") pod \"91388f2d-2b39-4988-a4b2-943db1e2da06\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.091439 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmj7b\" (UniqueName: \"kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b\") pod \"91388f2d-2b39-4988-a4b2-943db1e2da06\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.091507 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config\") pod \"91388f2d-2b39-4988-a4b2-943db1e2da06\" (UID: \"91388f2d-2b39-4988-a4b2-943db1e2da06\") " Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.097067 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b" (OuterVolumeSpecName: "kube-api-access-dmj7b") pod "91388f2d-2b39-4988-a4b2-943db1e2da06" (UID: "91388f2d-2b39-4988-a4b2-943db1e2da06"). InnerVolumeSpecName "kube-api-access-dmj7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.138540 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config" (OuterVolumeSpecName: "config") pod "91388f2d-2b39-4988-a4b2-943db1e2da06" (UID: "91388f2d-2b39-4988-a4b2-943db1e2da06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.139038 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91388f2d-2b39-4988-a4b2-943db1e2da06" (UID: "91388f2d-2b39-4988-a4b2-943db1e2da06"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.193581 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.193622 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91388f2d-2b39-4988-a4b2-943db1e2da06-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.193638 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmj7b\" (UniqueName: \"kubernetes.io/projected/91388f2d-2b39-4988-a4b2-943db1e2da06-kube-api-access-dmj7b\") on node \"crc\" DevicePath \"\"" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.323320 4822 generic.go:334] "Generic (PLEG): container finished" podID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerID="705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a" exitCode=0 Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.323384 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.323385 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" event={"ID":"91388f2d-2b39-4988-a4b2-943db1e2da06","Type":"ContainerDied","Data":"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a"} Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.323563 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-lq8nx" event={"ID":"91388f2d-2b39-4988-a4b2-943db1e2da06","Type":"ContainerDied","Data":"3bdc717937e7725689aa598c6dd3a80e1935da1c77a1b4ad2f503bb7d9f83235"} Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.323625 4822 scope.go:117] "RemoveContainer" containerID="705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.354553 4822 scope.go:117] "RemoveContainer" containerID="aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.374555 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.384280 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-lq8nx"] Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.393641 4822 scope.go:117] "RemoveContainer" containerID="705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a" Oct 10 07:41:36 crc kubenswrapper[4822]: E1010 07:41:36.394255 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a\": container with ID starting with 705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a not found: ID does not exist" 
containerID="705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.394308 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a"} err="failed to get container status \"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a\": rpc error: code = NotFound desc = could not find container \"705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a\": container with ID starting with 705d51d80fcfe63a6003e65332f24ab630d4b56bf0a6d32db24dd2758f155d4a not found: ID does not exist" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.394343 4822 scope.go:117] "RemoveContainer" containerID="aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b" Oct 10 07:41:36 crc kubenswrapper[4822]: E1010 07:41:36.395391 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b\": container with ID starting with aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b not found: ID does not exist" containerID="aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b" Oct 10 07:41:36 crc kubenswrapper[4822]: I1010 07:41:36.395442 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b"} err="failed to get container status \"aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b\": rpc error: code = NotFound desc = could not find container \"aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b\": container with ID starting with aad84788ffb633ac2b305711e866828ce2780c75f10b0e417dacbd25a8dca45b not found: ID does not exist" Oct 10 07:41:36 crc kubenswrapper[4822]: E1010 07:41:36.470354 4822 upgradeaware.go:427] 
Error proxying data from client to backend: readfrom tcp 38.102.83.180:49132->38.102.83.180:44473: write tcp 38.102.83.180:49132->38.102.83.180:44473: write: broken pipe Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.181797 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.181893 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.241552 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.395556 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.663189 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" path="/var/lib/kubelet/pods/91388f2d-2b39-4988-a4b2-943db1e2da06/volumes" Oct 10 07:41:37 crc kubenswrapper[4822]: I1010 07:41:37.777149 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 10 07:41:38 crc kubenswrapper[4822]: I1010 07:41:38.995421 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:38 crc kubenswrapper[4822]: I1010 07:41:38.995883 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:41 crc kubenswrapper[4822]: I1010 07:41:41.070649 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 10 07:41:41 crc kubenswrapper[4822]: I1010 07:41:41.159922 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 10 07:42:01 crc 
kubenswrapper[4822]: I1010 07:42:01.564630 4822 generic.go:334] "Generic (PLEG): container finished" podID="61a46496-ab17-4e29-a6df-ad6342f79139" containerID="f416e7ec2a96c6d518e22f30c62d024fc2a5e419ae8f59d187b9933c13ed0023" exitCode=0 Oct 10 07:42:01 crc kubenswrapper[4822]: I1010 07:42:01.564835 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerDied","Data":"f416e7ec2a96c6d518e22f30c62d024fc2a5e419ae8f59d187b9933c13ed0023"} Oct 10 07:42:01 crc kubenswrapper[4822]: I1010 07:42:01.567912 4822 generic.go:334] "Generic (PLEG): container finished" podID="dab00d16-02d5-42a5-95fc-88802736edf2" containerID="53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582" exitCode=0 Oct 10 07:42:01 crc kubenswrapper[4822]: I1010 07:42:01.567941 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerDied","Data":"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582"} Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.577034 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerStarted","Data":"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8"} Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.578432 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.579131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerStarted","Data":"5a732a7713ac1c785306382bbe15c746f7083b44378a4606384f887fa5c71b37"} Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.579356 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.598457 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.598438314 podStartE2EDuration="37.598438314s" podCreationTimestamp="2025-10-10 07:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:42:02.597036984 +0000 UTC m=+4669.692195210" watchObservedRunningTime="2025-10-10 07:42:02.598438314 +0000 UTC m=+4669.693596510" Oct 10 07:42:02 crc kubenswrapper[4822]: I1010 07:42:02.626682 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.626662079 podStartE2EDuration="38.626662079s" podCreationTimestamp="2025-10-10 07:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:42:02.622213161 +0000 UTC m=+4669.717371377" watchObservedRunningTime="2025-10-10 07:42:02.626662079 +0000 UTC m=+4669.721820285" Oct 10 07:42:16 crc kubenswrapper[4822]: I1010 07:42:16.298330 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 07:42:16 crc kubenswrapper[4822]: I1010 07:42:16.599815 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.845315 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"] Oct 10 07:42:21 crc kubenswrapper[4822]: E1010 07:42:21.846270 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="init" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846288 4822 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="init" Oct 10 07:42:21 crc kubenswrapper[4822]: E1010 07:42:21.846313 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="registry-server" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846321 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="registry-server" Oct 10 07:42:21 crc kubenswrapper[4822]: E1010 07:42:21.846344 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="extract-content" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846353 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="extract-content" Oct 10 07:42:21 crc kubenswrapper[4822]: E1010 07:42:21.846368 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="extract-utilities" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846376 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="extract-utilities" Oct 10 07:42:21 crc kubenswrapper[4822]: E1010 07:42:21.846389 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="dnsmasq-dns" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846398 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="dnsmasq-dns" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846566 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f599a9e7-b6b8-4a05-8acf-545095921c0a" containerName="registry-server" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.846580 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="91388f2d-2b39-4988-a4b2-943db1e2da06" containerName="dnsmasq-dns" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.847670 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.862916 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"] Oct 10 07:42:21 crc kubenswrapper[4822]: I1010 07:42:21.999672 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:21.999744 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzqdl\" (UniqueName: \"kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:21.999790 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.101325 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " 
pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.101515 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzqdl\" (UniqueName: \"kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.101634 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.102490 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.103373 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.120559 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzqdl\" (UniqueName: \"kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl\") pod \"dnsmasq-dns-5b7946d7b9-tdw2q\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 
07:42:22.164510 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.493482 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.615001 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"] Oct 10 07:42:22 crc kubenswrapper[4822]: I1010 07:42:22.727634 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" event={"ID":"6ea69b87-1caf-4ae2-9779-7695ce42f965","Type":"ContainerStarted","Data":"8caae1610af659423fe11a9e750b3bac5b7af93e299dd50b953aafac5130e515"} Oct 10 07:42:23 crc kubenswrapper[4822]: I1010 07:42:23.262393 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:23 crc kubenswrapper[4822]: I1010 07:42:23.735133 4822 generic.go:334] "Generic (PLEG): container finished" podID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerID="06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886" exitCode=0 Oct 10 07:42:23 crc kubenswrapper[4822]: I1010 07:42:23.735185 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" event={"ID":"6ea69b87-1caf-4ae2-9779-7695ce42f965","Type":"ContainerDied","Data":"06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886"} Oct 10 07:42:24 crc kubenswrapper[4822]: I1010 07:42:24.472269 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="rabbitmq" containerID="cri-o://5a732a7713ac1c785306382bbe15c746f7083b44378a4606384f887fa5c71b37" gracePeriod=604799 Oct 10 07:42:24 crc kubenswrapper[4822]: I1010 07:42:24.746421 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" event={"ID":"6ea69b87-1caf-4ae2-9779-7695ce42f965","Type":"ContainerStarted","Data":"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"} Oct 10 07:42:24 crc kubenswrapper[4822]: I1010 07:42:24.746767 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:24 crc kubenswrapper[4822]: I1010 07:42:24.769252 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" podStartSLOduration=3.769232532 podStartE2EDuration="3.769232532s" podCreationTimestamp="2025-10-10 07:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:42:24.764489886 +0000 UTC m=+4691.859648162" watchObservedRunningTime="2025-10-10 07:42:24.769232532 +0000 UTC m=+4691.864390748" Oct 10 07:42:25 crc kubenswrapper[4822]: I1010 07:42:25.143234 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="rabbitmq" containerID="cri-o://6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8" gracePeriod=604799 Oct 10 07:42:26 crc kubenswrapper[4822]: I1010 07:42:26.287395 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused" Oct 10 07:42:26 crc kubenswrapper[4822]: I1010 07:42:26.597258 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.241:5672: connect: connection refused" Oct 10 07:42:30 crc kubenswrapper[4822]: I1010 07:42:30.831622 4822 
generic.go:334] "Generic (PLEG): container finished" podID="61a46496-ab17-4e29-a6df-ad6342f79139" containerID="5a732a7713ac1c785306382bbe15c746f7083b44378a4606384f887fa5c71b37" exitCode=0 Oct 10 07:42:30 crc kubenswrapper[4822]: I1010 07:42:30.831740 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerDied","Data":"5a732a7713ac1c785306382bbe15c746f7083b44378a4606384f887fa5c71b37"} Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.144164 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.170369 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.170456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8g6h\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.171874 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173062 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173150 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173196 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173247 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173302 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.173360 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie\") pod \"61a46496-ab17-4e29-a6df-ad6342f79139\" (UID: \"61a46496-ab17-4e29-a6df-ad6342f79139\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 
07:42:31.174305 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.182135 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.182789 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.183019 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.186731 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h" (OuterVolumeSpecName: "kube-api-access-q8g6h") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "kube-api-access-q8g6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.187669 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info" (OuterVolumeSpecName: "pod-info") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.241158 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad" (OuterVolumeSpecName: "persistence") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.244625 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf" (OuterVolumeSpecName: "server-conf") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275422 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275457 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275470 4822 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/61a46496-ab17-4e29-a6df-ad6342f79139-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275482 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8g6h\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-kube-api-access-q8g6h\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275523 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") on node \"crc\" " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275538 4822 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275548 4822 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/61a46496-ab17-4e29-a6df-ad6342f79139-server-conf\") on node \"crc\" 
DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.275556 4822 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/61a46496-ab17-4e29-a6df-ad6342f79139-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.291981 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "61a46496-ab17-4e29-a6df-ad6342f79139" (UID: "61a46496-ab17-4e29-a6df-ad6342f79139"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.296312 4822 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.296699 4822 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad") on node "crc" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.376481 4822 reconciler_common.go:293] "Volume detached for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.376534 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/61a46496-ab17-4e29-a6df-ad6342f79139-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.759177 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.844414 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"61a46496-ab17-4e29-a6df-ad6342f79139","Type":"ContainerDied","Data":"b202148f7cd6b3e5b3ed171b3703ab43c71931189b2baa11cbef4597210e7b24"} Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.844473 4822 scope.go:117] "RemoveContainer" containerID="5a732a7713ac1c785306382bbe15c746f7083b44378a4606384f887fa5c71b37" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.844637 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.850319 4822 generic.go:334] "Generic (PLEG): container finished" podID="dab00d16-02d5-42a5-95fc-88802736edf2" containerID="6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8" exitCode=0 Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.850369 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerDied","Data":"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8"} Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.850378 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.850397 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dab00d16-02d5-42a5-95fc-88802736edf2","Type":"ContainerDied","Data":"bf3b5b5a6c73eee06e66c5fefc52216eee2444963e856d65ce2f72f083c68996"} Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.873592 4822 scope.go:117] "RemoveContainer" containerID="f416e7ec2a96c6d518e22f30c62d024fc2a5e419ae8f59d187b9933c13ed0023" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.875875 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.880644 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.881377 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.881473 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.881568 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wj77\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882131 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882231 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882314 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882401 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882547 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd\") pod \"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882680 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret\") pod 
\"dab00d16-02d5-42a5-95fc-88802736edf2\" (UID: \"dab00d16-02d5-42a5-95fc-88802736edf2\") " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.882180 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.885911 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info" (OuterVolumeSpecName: "pod-info") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.886190 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.886279 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77" (OuterVolumeSpecName: "kube-api-access-6wj77") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "kube-api-access-6wj77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.887655 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.906649 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf" (OuterVolumeSpecName: "server-conf") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.907016 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.912899 4822 scope.go:117] "RemoveContainer" containerID="6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.924246 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65" (OuterVolumeSpecName: "persistence") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "pvc-17aa32a7-f754-4151-aed0-1c20badb3d65". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933064 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:31 crc kubenswrapper[4822]: E1010 07:42:31.933445 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933468 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: E1010 07:42:31.933485 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933494 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: E1010 07:42:31.933505 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="setup-container" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933513 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="setup-container" Oct 10 07:42:31 crc kubenswrapper[4822]: E1010 07:42:31.933535 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="setup-container" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933543 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="setup-container" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.933707 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: 
I1010 07:42:31.933723 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" containerName="rabbitmq" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.934640 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.941053 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.941460 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.941760 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sv4bl" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.942597 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.951154 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.973926 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.984910 4822 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dab00d16-02d5-42a5-95fc-88802736edf2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.984944 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.984956 4822 reconciler_common.go:293] "Volume detached for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dab00d16-02d5-42a5-95fc-88802736edf2-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.984969 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wj77\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-kube-api-access-6wj77\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.984998 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") on node \"crc\" " Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.985011 4822 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.985023 4822 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dab00d16-02d5-42a5-95fc-88802736edf2-server-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:31 crc kubenswrapper[4822]: I1010 07:42:31.985034 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.000599 4822 scope.go:117] "RemoveContainer" containerID="53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086824 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086873 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086906 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086940 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086967 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6122a003-1c32-48ac-a2f8-902b898977ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.086996 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphfv\" (UniqueName: 
\"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-kube-api-access-xphfv\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.087083 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6122a003-1c32-48ac-a2f8-902b898977ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.087223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.087248 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.166237 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.188906 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphfv\" (UniqueName: \"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-kube-api-access-xphfv\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc 
kubenswrapper[4822]: I1010 07:42:32.188974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6122a003-1c32-48ac-a2f8-902b898977ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189028 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189052 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189096 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189163 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189204 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.189227 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6122a003-1c32-48ac-a2f8-902b898977ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.191912 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.193189 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.195874 4822 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.196061 4822 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-17aa32a7-f754-4151-aed0-1c20badb3d65" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65") on node "crc" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.196435 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.196423 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6122a003-1c32-48ac-a2f8-902b898977ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.196511 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6122a003-1c32-48ac-a2f8-902b898977ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.197731 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.197755 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7e19833da2871e608863fda1c55a3dff7d80fba167ff1bbc70233ad2f6e9936b/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.198041 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.202043 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6122a003-1c32-48ac-a2f8-902b898977ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.206166 4822 scope.go:117] "RemoveContainer" containerID="6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8" Oct 10 07:42:32 crc kubenswrapper[4822]: E1010 07:42:32.207360 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8\": container with ID starting with 6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8 not found: ID does not exist" containerID="6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 
07:42:32.207389 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8"} err="failed to get container status \"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8\": rpc error: code = NotFound desc = could not find container \"6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8\": container with ID starting with 6193a0cdfb366a707a9a30d3ca8107245bb1efa34727a9948b568556a6ca00f8 not found: ID does not exist" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.207412 4822 scope.go:117] "RemoveContainer" containerID="53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582" Oct 10 07:42:32 crc kubenswrapper[4822]: E1010 07:42:32.209917 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582\": container with ID starting with 53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582 not found: ID does not exist" containerID="53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.209953 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582"} err="failed to get container status \"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582\": rpc error: code = NotFound desc = could not find container \"53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582\": container with ID starting with 53b0c104df91dceaced7e9fcabec96eed8265247bde1582b3d4327578264f582 not found: ID does not exist" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.227705 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphfv\" (UniqueName: 
\"kubernetes.io/projected/6122a003-1c32-48ac-a2f8-902b898977ca-kube-api-access-xphfv\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.231288 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.231498 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="dnsmasq-dns" containerID="cri-o://3e47d42ea7a8a1c83d3749a5f423c560d2632c77e57662e21e61300f45cf9902" gracePeriod=10 Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.258690 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dab00d16-02d5-42a5-95fc-88802736edf2" (UID: "dab00d16-02d5-42a5-95fc-88802736edf2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.266424 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a50ee75-0190-4cd6-a902-3e704fe365ad\") pod \"rabbitmq-server-0\" (UID: \"6122a003-1c32-48ac-a2f8-902b898977ca\") " pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.287235 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.290248 4822 reconciler_common.go:293] "Volume detached for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.290277 4822 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dab00d16-02d5-42a5-95fc-88802736edf2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.497359 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.504132 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.510524 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.511911 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.514647 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.514992 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.515678 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jpb2q" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.515920 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.516996 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.522751 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.532320 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696284 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696349 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696395 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696436 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696646 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696728 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct862\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-kube-api-access-ct862\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696761 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.696841 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.697008 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798094 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798169 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798233 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798274 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct862\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-kube-api-access-ct862\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798305 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798349 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798380 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798425 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.798469 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.799840 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.799921 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.800933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.802459 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.802512 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a75def2ebe05f400a36c3ff2d587ad52f06bc348c19f821d527d36d8a0b81552/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.804851 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.804893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.805707 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.808325 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.816591 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct862\" (UniqueName: \"kubernetes.io/projected/3e8c0f60-e5ce-4c39-bf4b-7a9483088159-kube-api-access-ct862\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.851688 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17aa32a7-f754-4151-aed0-1c20badb3d65\") pod \"rabbitmq-cell1-server-0\" (UID: \"3e8c0f60-e5ce-4c39-bf4b-7a9483088159\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.860741 4822 generic.go:334] "Generic (PLEG): container finished" podID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerID="3e47d42ea7a8a1c83d3749a5f423c560d2632c77e57662e21e61300f45cf9902" exitCode=0 Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.860833 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" event={"ID":"5391ed5a-94fe-42ec-9c74-034697b6950f","Type":"ContainerDied","Data":"3e47d42ea7a8a1c83d3749a5f423c560d2632c77e57662e21e61300f45cf9902"} Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.860873 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" event={"ID":"5391ed5a-94fe-42ec-9c74-034697b6950f","Type":"ContainerDied","Data":"e53a2dca3627f32f7021378743a2a4cb5d9d4e448fe5fda25b3ccad821091c3f"} Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.860885 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e53a2dca3627f32f7021378743a2a4cb5d9d4e448fe5fda25b3ccad821091c3f" Oct 10 07:42:32 crc 
kubenswrapper[4822]: I1010 07:42:32.863881 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6122a003-1c32-48ac-a2f8-902b898977ca","Type":"ContainerStarted","Data":"25ade5f4e73245d4eeb8cfae514e832a08149b66df75955a405cb1d0c342bf7a"} Oct 10 07:42:32 crc kubenswrapper[4822]: I1010 07:42:32.894159 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.006136 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config\") pod \"5391ed5a-94fe-42ec-9c74-034697b6950f\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.006302 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc\") pod \"5391ed5a-94fe-42ec-9c74-034697b6950f\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.006376 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8\") pod \"5391ed5a-94fe-42ec-9c74-034697b6950f\" (UID: \"5391ed5a-94fe-42ec-9c74-034697b6950f\") " Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.010793 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8" (OuterVolumeSpecName: "kube-api-access-zrsn8") pod "5391ed5a-94fe-42ec-9c74-034697b6950f" (UID: "5391ed5a-94fe-42ec-9c74-034697b6950f"). InnerVolumeSpecName "kube-api-access-zrsn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.053103 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config" (OuterVolumeSpecName: "config") pod "5391ed5a-94fe-42ec-9c74-034697b6950f" (UID: "5391ed5a-94fe-42ec-9c74-034697b6950f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.086244 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5391ed5a-94fe-42ec-9c74-034697b6950f" (UID: "5391ed5a-94fe-42ec-9c74-034697b6950f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.107991 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.108033 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5391ed5a-94fe-42ec-9c74-034697b6950f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.108047 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrsn8\" (UniqueName: \"kubernetes.io/projected/5391ed5a-94fe-42ec-9c74-034697b6950f-kube-api-access-zrsn8\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.141618 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.441673 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:42:33 crc kubenswrapper[4822]: W1010 07:42:33.442693 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e8c0f60_e5ce_4c39_bf4b_7a9483088159.slice/crio-252daaead787065495fccb0fd18fcb32a50f4748d31a19dda533c1bd5e71410a WatchSource:0}: Error finding container 252daaead787065495fccb0fd18fcb32a50f4748d31a19dda533c1bd5e71410a: Status 404 returned error can't find the container with id 252daaead787065495fccb0fd18fcb32a50f4748d31a19dda533c1bd5e71410a Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.672974 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a46496-ab17-4e29-a6df-ad6342f79139" path="/var/lib/kubelet/pods/61a46496-ab17-4e29-a6df-ad6342f79139/volumes" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.675141 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab00d16-02d5-42a5-95fc-88802736edf2" path="/var/lib/kubelet/pods/dab00d16-02d5-42a5-95fc-88802736edf2/volumes" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.874272 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3e8c0f60-e5ce-4c39-bf4b-7a9483088159","Type":"ContainerStarted","Data":"252daaead787065495fccb0fd18fcb32a50f4748d31a19dda533c1bd5e71410a"} Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.874366 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vdm6z" Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.898187 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:42:33 crc kubenswrapper[4822]: I1010 07:42:33.905058 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vdm6z"] Oct 10 07:42:34 crc kubenswrapper[4822]: I1010 07:42:34.890947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6122a003-1c32-48ac-a2f8-902b898977ca","Type":"ContainerStarted","Data":"4c3efd41dbf291d14ce2802a8a274efabd57d9436a6041a014de4f0a48a4ddd5"} Oct 10 07:42:34 crc kubenswrapper[4822]: I1010 07:42:34.893911 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3e8c0f60-e5ce-4c39-bf4b-7a9483088159","Type":"ContainerStarted","Data":"8cf5c02848e9c3e2228795b077ce0f0cdc2c88ae26e824d8af11bbace7460896"} Oct 10 07:42:35 crc kubenswrapper[4822]: I1010 07:42:35.666988 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" path="/var/lib/kubelet/pods/5391ed5a-94fe-42ec-9c74-034697b6950f/volumes" Oct 10 07:43:07 crc kubenswrapper[4822]: I1010 07:43:07.204276 4822 generic.go:334] "Generic (PLEG): container finished" podID="6122a003-1c32-48ac-a2f8-902b898977ca" containerID="4c3efd41dbf291d14ce2802a8a274efabd57d9436a6041a014de4f0a48a4ddd5" exitCode=0 Oct 10 07:43:07 crc kubenswrapper[4822]: I1010 07:43:07.204540 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6122a003-1c32-48ac-a2f8-902b898977ca","Type":"ContainerDied","Data":"4c3efd41dbf291d14ce2802a8a274efabd57d9436a6041a014de4f0a48a4ddd5"} Oct 10 07:43:08 crc kubenswrapper[4822]: I1010 07:43:08.214449 4822 generic.go:334] "Generic (PLEG): container finished" podID="3e8c0f60-e5ce-4c39-bf4b-7a9483088159" 
containerID="8cf5c02848e9c3e2228795b077ce0f0cdc2c88ae26e824d8af11bbace7460896" exitCode=0 Oct 10 07:43:08 crc kubenswrapper[4822]: I1010 07:43:08.214535 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3e8c0f60-e5ce-4c39-bf4b-7a9483088159","Type":"ContainerDied","Data":"8cf5c02848e9c3e2228795b077ce0f0cdc2c88ae26e824d8af11bbace7460896"} Oct 10 07:43:08 crc kubenswrapper[4822]: I1010 07:43:08.222974 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6122a003-1c32-48ac-a2f8-902b898977ca","Type":"ContainerStarted","Data":"390265eceb989243bd8c28f32bc6983b3694833f81c1b9e97e3f7d9018174a69"} Oct 10 07:43:08 crc kubenswrapper[4822]: I1010 07:43:08.223292 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 07:43:08 crc kubenswrapper[4822]: I1010 07:43:08.276887 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.276864568 podStartE2EDuration="37.276864568s" podCreationTimestamp="2025-10-10 07:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:43:08.270777952 +0000 UTC m=+4735.365936178" watchObservedRunningTime="2025-10-10 07:43:08.276864568 +0000 UTC m=+4735.372022784" Oct 10 07:43:09 crc kubenswrapper[4822]: I1010 07:43:09.234891 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3e8c0f60-e5ce-4c39-bf4b-7a9483088159","Type":"ContainerStarted","Data":"03577f39ad3faec09c1e15befa809da103a87c8672f2139e370f390afff5838a"} Oct 10 07:43:09 crc kubenswrapper[4822]: I1010 07:43:09.235609 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:43:09 crc kubenswrapper[4822]: I1010 07:43:09.260208 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.260175662 podStartE2EDuration="37.260175662s" podCreationTimestamp="2025-10-10 07:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:43:09.254056235 +0000 UTC m=+4736.349214451" watchObservedRunningTime="2025-10-10 07:43:09.260175662 +0000 UTC m=+4736.355333898" Oct 10 07:43:22 crc kubenswrapper[4822]: I1010 07:43:22.290779 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 07:43:23 crc kubenswrapper[4822]: I1010 07:43:23.146124 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.056524 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 07:43:30 crc kubenswrapper[4822]: E1010 07:43:30.057532 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="dnsmasq-dns" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.057552 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="dnsmasq-dns" Oct 10 07:43:30 crc kubenswrapper[4822]: E1010 07:43:30.057586 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="init" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.057598 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="init" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.057959 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="5391ed5a-94fe-42ec-9c74-034697b6950f" containerName="dnsmasq-dns" Oct 10 07:43:30 crc 
kubenswrapper[4822]: I1010 07:43:30.058723 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.060576 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-v2br5" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.061595 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.099457 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vggk\" (UniqueName: \"kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk\") pod \"mariadb-client-1-default\" (UID: \"2a156d5a-c6be-478a-b05f-c840f01792be\") " pod="openstack/mariadb-client-1-default" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.200628 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vggk\" (UniqueName: \"kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk\") pod \"mariadb-client-1-default\" (UID: \"2a156d5a-c6be-478a-b05f-c840f01792be\") " pod="openstack/mariadb-client-1-default" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.240317 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vggk\" (UniqueName: \"kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk\") pod \"mariadb-client-1-default\" (UID: \"2a156d5a-c6be-478a-b05f-c840f01792be\") " pod="openstack/mariadb-client-1-default" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.412783 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 07:43:30 crc kubenswrapper[4822]: I1010 07:43:30.803217 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 07:43:31 crc kubenswrapper[4822]: I1010 07:43:31.336556 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:43:31 crc kubenswrapper[4822]: I1010 07:43:31.336667 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:43:31 crc kubenswrapper[4822]: I1010 07:43:31.437049 4822 generic.go:334] "Generic (PLEG): container finished" podID="2a156d5a-c6be-478a-b05f-c840f01792be" containerID="8ae9c0019aa5d20a2316264924237ae7f7641162b70f36ec35daf6193e73bf14" exitCode=0 Oct 10 07:43:31 crc kubenswrapper[4822]: I1010 07:43:31.437111 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"2a156d5a-c6be-478a-b05f-c840f01792be","Type":"ContainerDied","Data":"8ae9c0019aa5d20a2316264924237ae7f7641162b70f36ec35daf6193e73bf14"} Oct 10 07:43:31 crc kubenswrapper[4822]: I1010 07:43:31.437150 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"2a156d5a-c6be-478a-b05f-c840f01792be","Type":"ContainerStarted","Data":"43d7af536b964050b00d9ffac11cd7f1aef5ff2f3560532b0d25ec9b8cf23e7e"} Oct 10 07:43:32 crc kubenswrapper[4822]: I1010 07:43:32.925903 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 07:43:32 crc kubenswrapper[4822]: I1010 07:43:32.953362 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_2a156d5a-c6be-478a-b05f-c840f01792be/mariadb-client-1-default/0.log" Oct 10 07:43:32 crc kubenswrapper[4822]: I1010 07:43:32.988113 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.015153 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.066776 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vggk\" (UniqueName: \"kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk\") pod \"2a156d5a-c6be-478a-b05f-c840f01792be\" (UID: \"2a156d5a-c6be-478a-b05f-c840f01792be\") " Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.089522 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk" (OuterVolumeSpecName: "kube-api-access-6vggk") pod "2a156d5a-c6be-478a-b05f-c840f01792be" (UID: "2a156d5a-c6be-478a-b05f-c840f01792be"). InnerVolumeSpecName "kube-api-access-6vggk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.169104 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vggk\" (UniqueName: \"kubernetes.io/projected/2a156d5a-c6be-478a-b05f-c840f01792be-kube-api-access-6vggk\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.460658 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d7af536b964050b00d9ffac11cd7f1aef5ff2f3560532b0d25ec9b8cf23e7e" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.461247 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.488239 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 07:43:33 crc kubenswrapper[4822]: E1010 07:43:33.488883 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a156d5a-c6be-478a-b05f-c840f01792be" containerName="mariadb-client-1-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.488927 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a156d5a-c6be-478a-b05f-c840f01792be" containerName="mariadb-client-1-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.489288 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a156d5a-c6be-478a-b05f-c840f01792be" containerName="mariadb-client-1-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.490669 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.498016 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-v2br5" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.513580 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.670646 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a156d5a-c6be-478a-b05f-c840f01792be" path="/var/lib/kubelet/pods/2a156d5a-c6be-478a-b05f-c840f01792be/volumes" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.676783 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxfl\" (UniqueName: \"kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl\") pod \"mariadb-client-2-default\" (UID: \"7d1f822e-9a81-4a71-b629-573262584e6f\") " pod="openstack/mariadb-client-2-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.778340 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxfl\" (UniqueName: \"kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl\") pod \"mariadb-client-2-default\" (UID: \"7d1f822e-9a81-4a71-b629-573262584e6f\") " pod="openstack/mariadb-client-2-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.809558 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxfl\" (UniqueName: \"kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl\") pod \"mariadb-client-2-default\" (UID: \"7d1f822e-9a81-4a71-b629-573262584e6f\") " pod="openstack/mariadb-client-2-default" Oct 10 07:43:33 crc kubenswrapper[4822]: I1010 07:43:33.832643 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 07:43:34 crc kubenswrapper[4822]: I1010 07:43:34.256828 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 07:43:34 crc kubenswrapper[4822]: W1010 07:43:34.257715 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1f822e_9a81_4a71_b629_573262584e6f.slice/crio-043ce774d5644afa013e0c3bbfa883d182a2e67df6933304782698d0b1456b71 WatchSource:0}: Error finding container 043ce774d5644afa013e0c3bbfa883d182a2e67df6933304782698d0b1456b71: Status 404 returned error can't find the container with id 043ce774d5644afa013e0c3bbfa883d182a2e67df6933304782698d0b1456b71 Oct 10 07:43:34 crc kubenswrapper[4822]: I1010 07:43:34.469821 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"7d1f822e-9a81-4a71-b629-573262584e6f","Type":"ContainerStarted","Data":"e9f7058d53ab00c95779d74c02dd026312193e81c8ba0606a55dae7efdb833ec"} Oct 10 07:43:34 crc kubenswrapper[4822]: I1010 07:43:34.470123 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"7d1f822e-9a81-4a71-b629-573262584e6f","Type":"ContainerStarted","Data":"043ce774d5644afa013e0c3bbfa883d182a2e67df6933304782698d0b1456b71"} Oct 10 07:43:34 crc kubenswrapper[4822]: I1010 07:43:34.488521 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.488478317 podStartE2EDuration="1.488478317s" podCreationTimestamp="2025-10-10 07:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:43:34.488094666 +0000 UTC m=+4761.583252922" watchObservedRunningTime="2025-10-10 07:43:34.488478317 +0000 UTC m=+4761.583636513" Oct 10 07:43:35 crc kubenswrapper[4822]: 
I1010 07:43:35.481481 4822 generic.go:334] "Generic (PLEG): container finished" podID="7d1f822e-9a81-4a71-b629-573262584e6f" containerID="e9f7058d53ab00c95779d74c02dd026312193e81c8ba0606a55dae7efdb833ec" exitCode=0 Oct 10 07:43:35 crc kubenswrapper[4822]: I1010 07:43:35.481568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"7d1f822e-9a81-4a71-b629-573262584e6f","Type":"ContainerDied","Data":"e9f7058d53ab00c95779d74c02dd026312193e81c8ba0606a55dae7efdb833ec"} Oct 10 07:43:36 crc kubenswrapper[4822]: I1010 07:43:36.944304 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.007968 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.018036 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.135763 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbxfl\" (UniqueName: \"kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl\") pod \"7d1f822e-9a81-4a71-b629-573262584e6f\" (UID: \"7d1f822e-9a81-4a71-b629-573262584e6f\") " Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.144399 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl" (OuterVolumeSpecName: "kube-api-access-vbxfl") pod "7d1f822e-9a81-4a71-b629-573262584e6f" (UID: "7d1f822e-9a81-4a71-b629-573262584e6f"). InnerVolumeSpecName "kube-api-access-vbxfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.238052 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbxfl\" (UniqueName: \"kubernetes.io/projected/7d1f822e-9a81-4a71-b629-573262584e6f-kube-api-access-vbxfl\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.506899 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="043ce774d5644afa013e0c3bbfa883d182a2e67df6933304782698d0b1456b71" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.507019 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.523003 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 10 07:43:37 crc kubenswrapper[4822]: E1010 07:43:37.523542 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1f822e-9a81-4a71-b629-573262584e6f" containerName="mariadb-client-2-default" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.523582 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1f822e-9a81-4a71-b629-573262584e6f" containerName="mariadb-client-2-default" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.524017 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1f822e-9a81-4a71-b629-573262584e6f" containerName="mariadb-client-2-default" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.527170 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.535113 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-v2br5" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.553619 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.644494 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlznp\" (UniqueName: \"kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp\") pod \"mariadb-client-1\" (UID: \"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a\") " pod="openstack/mariadb-client-1" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.664420 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1f822e-9a81-4a71-b629-573262584e6f" path="/var/lib/kubelet/pods/7d1f822e-9a81-4a71-b629-573262584e6f/volumes" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.746083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlznp\" (UniqueName: \"kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp\") pod \"mariadb-client-1\" (UID: \"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a\") " pod="openstack/mariadb-client-1" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.772650 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlznp\" (UniqueName: \"kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp\") pod \"mariadb-client-1\" (UID: \"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a\") " pod="openstack/mariadb-client-1" Oct 10 07:43:37 crc kubenswrapper[4822]: I1010 07:43:37.877418 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 07:43:38 crc kubenswrapper[4822]: I1010 07:43:38.519394 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 07:43:39 crc kubenswrapper[4822]: I1010 07:43:39.541678 4822 generic.go:334] "Generic (PLEG): container finished" podID="e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" containerID="0d5a761e2c52bf6d8078780afe6ae515e274759c6d54b37bd91966d62e886d41" exitCode=0 Oct 10 07:43:39 crc kubenswrapper[4822]: I1010 07:43:39.541742 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a","Type":"ContainerDied","Data":"0d5a761e2c52bf6d8078780afe6ae515e274759c6d54b37bd91966d62e886d41"} Oct 10 07:43:39 crc kubenswrapper[4822]: I1010 07:43:39.542020 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a","Type":"ContainerStarted","Data":"7a32aae77e5548172b2a7fa5ca2d6588ac027fdb0ebb9a39a8a268c87989752b"} Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:40.931025 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:40.949959 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_e3c1bdb9-52b8-470c-9233-bb2cf7b1498a/mariadb-client-1/0.log" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:40.997677 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.008042 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.106186 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlznp\" (UniqueName: \"kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp\") pod \"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a\" (UID: \"e3c1bdb9-52b8-470c-9233-bb2cf7b1498a\") " Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.117555 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp" (OuterVolumeSpecName: "kube-api-access-rlznp") pod "e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" (UID: "e3c1bdb9-52b8-470c-9233-bb2cf7b1498a"). InnerVolumeSpecName "kube-api-access-rlznp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.207961 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlznp\" (UniqueName: \"kubernetes.io/projected/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a-kube-api-access-rlznp\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.476788 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 07:43:41 crc kubenswrapper[4822]: E1010 07:43:41.477209 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" containerName="mariadb-client-1" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.477232 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" containerName="mariadb-client-1" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.477471 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" containerName="mariadb-client-1" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.478219 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.500070 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.522125 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8nkc\" (UniqueName: \"kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc\") pod \"mariadb-client-4-default\" (UID: \"30054749-819f-405a-b89e-1172238b804d\") " pod="openstack/mariadb-client-4-default" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.563473 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a32aae77e5548172b2a7fa5ca2d6588ac027fdb0ebb9a39a8a268c87989752b" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.563513 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.623260 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nkc\" (UniqueName: \"kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc\") pod \"mariadb-client-4-default\" (UID: \"30054749-819f-405a-b89e-1172238b804d\") " pod="openstack/mariadb-client-4-default" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.644264 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nkc\" (UniqueName: \"kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc\") pod \"mariadb-client-4-default\" (UID: \"30054749-819f-405a-b89e-1172238b804d\") " pod="openstack/mariadb-client-4-default" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.663119 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c1bdb9-52b8-470c-9233-bb2cf7b1498a" 
path="/var/lib/kubelet/pods/e3c1bdb9-52b8-470c-9233-bb2cf7b1498a/volumes" Oct 10 07:43:41 crc kubenswrapper[4822]: I1010 07:43:41.807023 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 07:43:42 crc kubenswrapper[4822]: I1010 07:43:42.392349 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 07:43:42 crc kubenswrapper[4822]: I1010 07:43:42.575673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"30054749-819f-405a-b89e-1172238b804d","Type":"ContainerStarted","Data":"6d1366868c83d471d32491451c9423bd8b53a13da962f3e862b59a9993c3253e"} Oct 10 07:43:43 crc kubenswrapper[4822]: I1010 07:43:43.589757 4822 generic.go:334] "Generic (PLEG): container finished" podID="30054749-819f-405a-b89e-1172238b804d" containerID="47d279f999265c83ca04ecafe09a0c894a410f3fd2521090f70517562a68f9a6" exitCode=0 Oct 10 07:43:43 crc kubenswrapper[4822]: I1010 07:43:43.589838 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"30054749-819f-405a-b89e-1172238b804d","Type":"ContainerDied","Data":"47d279f999265c83ca04ecafe09a0c894a410f3fd2521090f70517562a68f9a6"} Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.089210 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.113607 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_30054749-819f-405a-b89e-1172238b804d/mariadb-client-4-default/0.log" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.174134 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.179219 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.180049 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8nkc\" (UniqueName: \"kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc\") pod \"30054749-819f-405a-b89e-1172238b804d\" (UID: \"30054749-819f-405a-b89e-1172238b804d\") " Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.187588 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc" (OuterVolumeSpecName: "kube-api-access-p8nkc") pod "30054749-819f-405a-b89e-1172238b804d" (UID: "30054749-819f-405a-b89e-1172238b804d"). InnerVolumeSpecName "kube-api-access-p8nkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.282231 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8nkc\" (UniqueName: \"kubernetes.io/projected/30054749-819f-405a-b89e-1172238b804d-kube-api-access-p8nkc\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.618760 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1366868c83d471d32491451c9423bd8b53a13da962f3e862b59a9993c3253e" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.618917 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 07:43:45 crc kubenswrapper[4822]: I1010 07:43:45.663082 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30054749-819f-405a-b89e-1172238b804d" path="/var/lib/kubelet/pods/30054749-819f-405a-b89e-1172238b804d/volumes" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.951939 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 07:43:49 crc kubenswrapper[4822]: E1010 07:43:49.953459 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30054749-819f-405a-b89e-1172238b804d" containerName="mariadb-client-4-default" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.953493 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="30054749-819f-405a-b89e-1172238b804d" containerName="mariadb-client-4-default" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.954028 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="30054749-819f-405a-b89e-1172238b804d" containerName="mariadb-client-4-default" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.955105 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.958790 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-v2br5" Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.963657 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 07:43:49 crc kubenswrapper[4822]: I1010 07:43:49.969786 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kbb4\" (UniqueName: \"kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4\") pod \"mariadb-client-5-default\" (UID: \"e745359b-a5c6-4d32-93af-c1bb8687adea\") " pod="openstack/mariadb-client-5-default" Oct 10 07:43:50 crc kubenswrapper[4822]: I1010 07:43:50.071178 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kbb4\" (UniqueName: \"kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4\") pod \"mariadb-client-5-default\" (UID: \"e745359b-a5c6-4d32-93af-c1bb8687adea\") " pod="openstack/mariadb-client-5-default" Oct 10 07:43:50 crc kubenswrapper[4822]: I1010 07:43:50.102242 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kbb4\" (UniqueName: \"kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4\") pod \"mariadb-client-5-default\" (UID: \"e745359b-a5c6-4d32-93af-c1bb8687adea\") " pod="openstack/mariadb-client-5-default" Oct 10 07:43:50 crc kubenswrapper[4822]: I1010 07:43:50.284891 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 07:43:51 crc kubenswrapper[4822]: I1010 07:43:51.116749 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 07:43:51 crc kubenswrapper[4822]: I1010 07:43:51.742840 4822 generic.go:334] "Generic (PLEG): container finished" podID="e745359b-a5c6-4d32-93af-c1bb8687adea" containerID="f19a11301626fc440ddff88c409549a4156af9542a3172a9643bafb02954f395" exitCode=0 Oct 10 07:43:51 crc kubenswrapper[4822]: I1010 07:43:51.742912 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"e745359b-a5c6-4d32-93af-c1bb8687adea","Type":"ContainerDied","Data":"f19a11301626fc440ddff88c409549a4156af9542a3172a9643bafb02954f395"} Oct 10 07:43:51 crc kubenswrapper[4822]: I1010 07:43:51.743505 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"e745359b-a5c6-4d32-93af-c1bb8687adea","Type":"ContainerStarted","Data":"8ff3f5a0a261c2d4a6e2317ccb4d5f05b787916b2a8a8ff18bb17f0557008e5e"} Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.356948 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.379348 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_e745359b-a5c6-4d32-93af-c1bb8687adea/mariadb-client-5-default/0.log" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.413605 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.423207 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.528835 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kbb4\" (UniqueName: \"kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4\") pod \"e745359b-a5c6-4d32-93af-c1bb8687adea\" (UID: \"e745359b-a5c6-4d32-93af-c1bb8687adea\") " Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.538958 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4" (OuterVolumeSpecName: "kube-api-access-2kbb4") pod "e745359b-a5c6-4d32-93af-c1bb8687adea" (UID: "e745359b-a5c6-4d32-93af-c1bb8687adea"). InnerVolumeSpecName "kube-api-access-2kbb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.595382 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 07:43:53 crc kubenswrapper[4822]: E1010 07:43:53.596024 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e745359b-a5c6-4d32-93af-c1bb8687adea" containerName="mariadb-client-5-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.596056 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e745359b-a5c6-4d32-93af-c1bb8687adea" containerName="mariadb-client-5-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.596365 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e745359b-a5c6-4d32-93af-c1bb8687adea" containerName="mariadb-client-5-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.597302 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.604341 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.631619 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kbb4\" (UniqueName: \"kubernetes.io/projected/e745359b-a5c6-4d32-93af-c1bb8687adea-kube-api-access-2kbb4\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.662912 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e745359b-a5c6-4d32-93af-c1bb8687adea" path="/var/lib/kubelet/pods/e745359b-a5c6-4d32-93af-c1bb8687adea/volumes" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.733273 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztdz\" (UniqueName: 
\"kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz\") pod \"mariadb-client-6-default\" (UID: \"d2625a4c-b340-4745-bc3f-21877461b529\") " pod="openstack/mariadb-client-6-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.764546 4822 scope.go:117] "RemoveContainer" containerID="f19a11301626fc440ddff88c409549a4156af9542a3172a9643bafb02954f395" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.764647 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.835128 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztdz\" (UniqueName: \"kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz\") pod \"mariadb-client-6-default\" (UID: \"d2625a4c-b340-4745-bc3f-21877461b529\") " pod="openstack/mariadb-client-6-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.867666 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztdz\" (UniqueName: \"kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz\") pod \"mariadb-client-6-default\" (UID: \"d2625a4c-b340-4745-bc3f-21877461b529\") " pod="openstack/mariadb-client-6-default" Oct 10 07:43:53 crc kubenswrapper[4822]: I1010 07:43:53.923926 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 07:43:54 crc kubenswrapper[4822]: I1010 07:43:54.530133 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 07:43:54 crc kubenswrapper[4822]: I1010 07:43:54.778089 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d2625a4c-b340-4745-bc3f-21877461b529","Type":"ContainerStarted","Data":"e612fae1f489ada2db51e4937445cdf14948e7431f52306dd838e1f1502ee45c"} Oct 10 07:43:54 crc kubenswrapper[4822]: I1010 07:43:54.778435 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d2625a4c-b340-4745-bc3f-21877461b529","Type":"ContainerStarted","Data":"151ed40b298f202cd9de7f554da8caef1519baade1c6bc83402fe4acafe5ec97"} Oct 10 07:43:54 crc kubenswrapper[4822]: I1010 07:43:54.811485 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.811461128 podStartE2EDuration="1.811461128s" podCreationTimestamp="2025-10-10 07:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:43:54.7966419 +0000 UTC m=+4781.891800126" watchObservedRunningTime="2025-10-10 07:43:54.811461128 +0000 UTC m=+4781.906619324" Oct 10 07:43:55 crc kubenswrapper[4822]: I1010 07:43:55.795375 4822 generic.go:334] "Generic (PLEG): container finished" podID="d2625a4c-b340-4745-bc3f-21877461b529" containerID="e612fae1f489ada2db51e4937445cdf14948e7431f52306dd838e1f1502ee45c" exitCode=0 Oct 10 07:43:55 crc kubenswrapper[4822]: I1010 07:43:55.795476 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d2625a4c-b340-4745-bc3f-21877461b529","Type":"ContainerDied","Data":"e612fae1f489ada2db51e4937445cdf14948e7431f52306dd838e1f1502ee45c"} Oct 10 07:43:57 crc 
kubenswrapper[4822]: I1010 07:43:57.279171 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.323468 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.328045 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.396633 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hztdz\" (UniqueName: \"kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz\") pod \"d2625a4c-b340-4745-bc3f-21877461b529\" (UID: \"d2625a4c-b340-4745-bc3f-21877461b529\") " Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.404833 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz" (OuterVolumeSpecName: "kube-api-access-hztdz") pod "d2625a4c-b340-4745-bc3f-21877461b529" (UID: "d2625a4c-b340-4745-bc3f-21877461b529"). InnerVolumeSpecName "kube-api-access-hztdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.494253 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 07:43:57 crc kubenswrapper[4822]: E1010 07:43:57.494938 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2625a4c-b340-4745-bc3f-21877461b529" containerName="mariadb-client-6-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.495021 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2625a4c-b340-4745-bc3f-21877461b529" containerName="mariadb-client-6-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.495530 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2625a4c-b340-4745-bc3f-21877461b529" containerName="mariadb-client-6-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.496723 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.498101 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hztdz\" (UniqueName: \"kubernetes.io/projected/d2625a4c-b340-4745-bc3f-21877461b529-kube-api-access-hztdz\") on node \"crc\" DevicePath \"\"" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.503911 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.600002 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9f8\" (UniqueName: \"kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8\") pod \"mariadb-client-7-default\" (UID: \"50d94843-6c3b-4683-871b-77769e9daa22\") " pod="openstack/mariadb-client-7-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.667643 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d2625a4c-b340-4745-bc3f-21877461b529" path="/var/lib/kubelet/pods/d2625a4c-b340-4745-bc3f-21877461b529/volumes" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.702054 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv9f8\" (UniqueName: \"kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8\") pod \"mariadb-client-7-default\" (UID: \"50d94843-6c3b-4683-871b-77769e9daa22\") " pod="openstack/mariadb-client-7-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.733045 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv9f8\" (UniqueName: \"kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8\") pod \"mariadb-client-7-default\" (UID: \"50d94843-6c3b-4683-871b-77769e9daa22\") " pod="openstack/mariadb-client-7-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.819122 4822 scope.go:117] "RemoveContainer" containerID="e612fae1f489ada2db51e4937445cdf14948e7431f52306dd838e1f1502ee45c" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.819185 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 07:43:57 crc kubenswrapper[4822]: I1010 07:43:57.828233 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 07:43:58 crc kubenswrapper[4822]: I1010 07:43:58.495563 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 07:43:58 crc kubenswrapper[4822]: W1010 07:43:58.510425 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d94843_6c3b_4683_871b_77769e9daa22.slice/crio-13ecaba7fe06559cc01dac00b32545d856de258eb460d55b6f7ea594d043b784 WatchSource:0}: Error finding container 13ecaba7fe06559cc01dac00b32545d856de258eb460d55b6f7ea594d043b784: Status 404 returned error can't find the container with id 13ecaba7fe06559cc01dac00b32545d856de258eb460d55b6f7ea594d043b784 Oct 10 07:43:58 crc kubenswrapper[4822]: I1010 07:43:58.832980 4822 generic.go:334] "Generic (PLEG): container finished" podID="50d94843-6c3b-4683-871b-77769e9daa22" containerID="e9b2ac8cc4e0b990dd7fa5e62af3327420a07ac93c87c46a75698fed681f0998" exitCode=0 Oct 10 07:43:58 crc kubenswrapper[4822]: I1010 07:43:58.833027 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"50d94843-6c3b-4683-871b-77769e9daa22","Type":"ContainerDied","Data":"e9b2ac8cc4e0b990dd7fa5e62af3327420a07ac93c87c46a75698fed681f0998"} Oct 10 07:43:58 crc kubenswrapper[4822]: I1010 07:43:58.833288 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"50d94843-6c3b-4683-871b-77769e9daa22","Type":"ContainerStarted","Data":"13ecaba7fe06559cc01dac00b32545d856de258eb460d55b6f7ea594d043b784"} Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.329750 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.353846 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_50d94843-6c3b-4683-871b-77769e9daa22/mariadb-client-7-default/0.log" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.388034 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.392433 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.449628 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv9f8\" (UniqueName: \"kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8\") pod \"50d94843-6c3b-4683-871b-77769e9daa22\" (UID: \"50d94843-6c3b-4683-871b-77769e9daa22\") " Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.456104 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8" (OuterVolumeSpecName: "kube-api-access-hv9f8") pod "50d94843-6c3b-4683-871b-77769e9daa22" (UID: "50d94843-6c3b-4683-871b-77769e9daa22"). InnerVolumeSpecName "kube-api-access-hv9f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.551713 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv9f8\" (UniqueName: \"kubernetes.io/projected/50d94843-6c3b-4683-871b-77769e9daa22-kube-api-access-hv9f8\") on node \"crc\" DevicePath \"\"" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.600864 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 10 07:44:00 crc kubenswrapper[4822]: E1010 07:44:00.601785 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d94843-6c3b-4683-871b-77769e9daa22" containerName="mariadb-client-7-default" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.601853 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d94843-6c3b-4683-871b-77769e9daa22" containerName="mariadb-client-7-default" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.602221 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d94843-6c3b-4683-871b-77769e9daa22" containerName="mariadb-client-7-default" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.603118 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.612972 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.754905 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h28w\" (UniqueName: \"kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w\") pod \"mariadb-client-2\" (UID: \"17b4f069-5332-4d11-b017-2acb9a015432\") " pod="openstack/mariadb-client-2" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.854640 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ecaba7fe06559cc01dac00b32545d856de258eb460d55b6f7ea594d043b784" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.854729 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.857060 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h28w\" (UniqueName: \"kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w\") pod \"mariadb-client-2\" (UID: \"17b4f069-5332-4d11-b017-2acb9a015432\") " pod="openstack/mariadb-client-2" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.887320 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h28w\" (UniqueName: \"kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w\") pod \"mariadb-client-2\" (UID: \"17b4f069-5332-4d11-b017-2acb9a015432\") " pod="openstack/mariadb-client-2" Oct 10 07:44:00 crc kubenswrapper[4822]: I1010 07:44:00.937502 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.303879 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 07:44:01 crc kubenswrapper[4822]: W1010 07:44:01.317081 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17b4f069_5332_4d11_b017_2acb9a015432.slice/crio-62ae0e06e3ecee6c78aa2957ab1b5c00ba65252e2981eb66d5cba581f84894d1 WatchSource:0}: Error finding container 62ae0e06e3ecee6c78aa2957ab1b5c00ba65252e2981eb66d5cba581f84894d1: Status 404 returned error can't find the container with id 62ae0e06e3ecee6c78aa2957ab1b5c00ba65252e2981eb66d5cba581f84894d1 Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.339237 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.339316 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.667915 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d94843-6c3b-4683-871b-77769e9daa22" path="/var/lib/kubelet/pods/50d94843-6c3b-4683-871b-77769e9daa22/volumes" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.701313 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.703909 4822 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.717784 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.865468 4822 generic.go:334] "Generic (PLEG): container finished" podID="17b4f069-5332-4d11-b017-2acb9a015432" containerID="6362c87db1c53baa6cb2811941ba486326e9a96c3886f024867bee2025fe66dc" exitCode=0 Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.865534 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"17b4f069-5332-4d11-b017-2acb9a015432","Type":"ContainerDied","Data":"6362c87db1c53baa6cb2811941ba486326e9a96c3886f024867bee2025fe66dc"} Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.865571 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"17b4f069-5332-4d11-b017-2acb9a015432","Type":"ContainerStarted","Data":"62ae0e06e3ecee6c78aa2957ab1b5c00ba65252e2981eb66d5cba581f84894d1"} Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.878486 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j95kc\" (UniqueName: \"kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.878734 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc 
kubenswrapper[4822]: I1010 07:44:01.878931 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.980012 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j95kc\" (UniqueName: \"kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.980117 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.980159 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.980572 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:01 crc kubenswrapper[4822]: I1010 07:44:01.980654 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.002230 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j95kc\" (UniqueName: \"kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc\") pod \"redhat-operators-g5fxw\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.041043 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.314624 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:02 crc kubenswrapper[4822]: W1010 07:44:02.323350 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc2ba55_870c_4d13_801d_f5ed2d3c35c5.slice/crio-fcd3235380771e51840077f70d06ca568a685646640d4e40b18061fb59afffc4 WatchSource:0}: Error finding container fcd3235380771e51840077f70d06ca568a685646640d4e40b18061fb59afffc4: Status 404 returned error can't find the container with id fcd3235380771e51840077f70d06ca568a685646640d4e40b18061fb59afffc4 Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.876983 4822 generic.go:334] "Generic (PLEG): container finished" podID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerID="62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd" exitCode=0 Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.877191 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerDied","Data":"62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd"} Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.877367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerStarted","Data":"fcd3235380771e51840077f70d06ca568a685646640d4e40b18061fb59afffc4"} Oct 10 07:44:02 crc kubenswrapper[4822]: I1010 07:44:02.879638 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.263883 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.279907 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_17b4f069-5332-4d11-b017-2acb9a015432/mariadb-client-2/0.log" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.322357 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h28w\" (UniqueName: \"kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w\") pod \"17b4f069-5332-4d11-b017-2acb9a015432\" (UID: \"17b4f069-5332-4d11-b017-2acb9a015432\") " Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.327993 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w" (OuterVolumeSpecName: "kube-api-access-6h28w") pod "17b4f069-5332-4d11-b017-2acb9a015432" (UID: "17b4f069-5332-4d11-b017-2acb9a015432"). InnerVolumeSpecName "kube-api-access-6h28w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.344317 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.349047 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.424011 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h28w\" (UniqueName: \"kubernetes.io/projected/17b4f069-5332-4d11-b017-2acb9a015432-kube-api-access-6h28w\") on node \"crc\" DevicePath \"\"" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.668278 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b4f069-5332-4d11-b017-2acb9a015432" path="/var/lib/kubelet/pods/17b4f069-5332-4d11-b017-2acb9a015432/volumes" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.886232 4822 scope.go:117] "RemoveContainer" containerID="6362c87db1c53baa6cb2811941ba486326e9a96c3886f024867bee2025fe66dc" Oct 10 07:44:03 crc kubenswrapper[4822]: I1010 07:44:03.886321 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 07:44:04 crc kubenswrapper[4822]: I1010 07:44:04.897544 4822 generic.go:334] "Generic (PLEG): container finished" podID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerID="b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa" exitCode=0 Oct 10 07:44:04 crc kubenswrapper[4822]: I1010 07:44:04.897636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerDied","Data":"b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa"} Oct 10 07:44:06 crc kubenswrapper[4822]: I1010 07:44:06.920346 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerStarted","Data":"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100"} Oct 10 07:44:06 crc kubenswrapper[4822]: I1010 07:44:06.946774 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5fxw" podStartSLOduration=2.911865679 podStartE2EDuration="5.946756943s" podCreationTimestamp="2025-10-10 07:44:01 +0000 UTC" firstStartedPulling="2025-10-10 07:44:02.879441587 +0000 UTC m=+4789.974599783" lastFinishedPulling="2025-10-10 07:44:05.914332811 +0000 UTC m=+4793.009491047" observedRunningTime="2025-10-10 07:44:06.942061588 +0000 UTC m=+4794.037219784" watchObservedRunningTime="2025-10-10 07:44:06.946756943 +0000 UTC m=+4794.041915139" Oct 10 07:44:12 crc kubenswrapper[4822]: I1010 07:44:12.041859 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:12 crc kubenswrapper[4822]: I1010 07:44:12.042699 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:12 crc kubenswrapper[4822]: 
I1010 07:44:12.122950 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:13 crc kubenswrapper[4822]: I1010 07:44:13.032558 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:13 crc kubenswrapper[4822]: I1010 07:44:13.100579 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:14 crc kubenswrapper[4822]: I1010 07:44:14.989467 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g5fxw" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="registry-server" containerID="cri-o://b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100" gracePeriod=2 Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.488164 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.653242 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content\") pod \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.653429 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j95kc\" (UniqueName: \"kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc\") pod \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.653502 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities\") pod \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\" (UID: \"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5\") " Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.654694 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities" (OuterVolumeSpecName: "utilities") pod "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" (UID: "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.669316 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc" (OuterVolumeSpecName: "kube-api-access-j95kc") pod "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" (UID: "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5"). InnerVolumeSpecName "kube-api-access-j95kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.756438 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j95kc\" (UniqueName: \"kubernetes.io/projected/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-kube-api-access-j95kc\") on node \"crc\" DevicePath \"\"" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.756507 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.757363 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" (UID: "5fc2ba55-870c-4d13-801d-f5ed2d3c35c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.857951 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.998480 4822 generic.go:334] "Generic (PLEG): container finished" podID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerID="b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100" exitCode=0 Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.998671 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerDied","Data":"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100"} Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.999310 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5fxw" event={"ID":"5fc2ba55-870c-4d13-801d-f5ed2d3c35c5","Type":"ContainerDied","Data":"fcd3235380771e51840077f70d06ca568a685646640d4e40b18061fb59afffc4"} Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.999386 4822 scope.go:117] "RemoveContainer" containerID="b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100" Oct 10 07:44:15 crc kubenswrapper[4822]: I1010 07:44:15.998834 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5fxw" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.028627 4822 scope.go:117] "RemoveContainer" containerID="b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.035672 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.041925 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g5fxw"] Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.287184 4822 scope.go:117] "RemoveContainer" containerID="62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.327523 4822 scope.go:117] "RemoveContainer" containerID="b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100" Oct 10 07:44:16 crc kubenswrapper[4822]: E1010 07:44:16.327944 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100\": container with ID starting with b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100 not found: ID does not exist" containerID="b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.327978 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100"} err="failed to get container status \"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100\": rpc error: code = NotFound desc = could not find container \"b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100\": container with ID starting with b9440f6317ce4189611d5724150bef527e5e44a42bd20d522bc54a7007b1b100 not found: ID does 
not exist" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.327998 4822 scope.go:117] "RemoveContainer" containerID="b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa" Oct 10 07:44:16 crc kubenswrapper[4822]: E1010 07:44:16.328615 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa\": container with ID starting with b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa not found: ID does not exist" containerID="b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.328831 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa"} err="failed to get container status \"b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa\": rpc error: code = NotFound desc = could not find container \"b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa\": container with ID starting with b96b999bc417c47c5af331d184d13017dfd47bc538ba0dbd659e7a62d3eb1cfa not found: ID does not exist" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.329038 4822 scope.go:117] "RemoveContainer" containerID="62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd" Oct 10 07:44:16 crc kubenswrapper[4822]: E1010 07:44:16.329499 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd\": container with ID starting with 62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd not found: ID does not exist" containerID="62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd" Oct 10 07:44:16 crc kubenswrapper[4822]: I1010 07:44:16.329523 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd"} err="failed to get container status \"62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd\": rpc error: code = NotFound desc = could not find container \"62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd\": container with ID starting with 62223082e57f220654aaeaa0e956a381e61e176339aa98558dbf8b8e4b8f3efd not found: ID does not exist" Oct 10 07:44:17 crc kubenswrapper[4822]: I1010 07:44:17.667654 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" path="/var/lib/kubelet/pods/5fc2ba55-870c-4d13-801d-f5ed2d3c35c5/volumes" Oct 10 07:44:22 crc kubenswrapper[4822]: I1010 07:44:22.561320 4822 scope.go:117] "RemoveContainer" containerID="bc2bc23f3ce8594ba4a536947b02565e2d300edf4557017316b62e03ed3cf795" Oct 10 07:44:31 crc kubenswrapper[4822]: I1010 07:44:31.336606 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:44:31 crc kubenswrapper[4822]: I1010 07:44:31.337296 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:44:31 crc kubenswrapper[4822]: I1010 07:44:31.337385 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:44:31 crc kubenswrapper[4822]: I1010 07:44:31.338434 4822 kuberuntime_manager.go:1027] "Message for Container 
of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:44:31 crc kubenswrapper[4822]: I1010 07:44:31.338552 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae" gracePeriod=600 Oct 10 07:44:32 crc kubenswrapper[4822]: I1010 07:44:32.164322 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae" exitCode=0 Oct 10 07:44:32 crc kubenswrapper[4822]: I1010 07:44:32.164375 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae"} Oct 10 07:44:32 crc kubenswrapper[4822]: I1010 07:44:32.164413 4822 scope.go:117] "RemoveContainer" containerID="36480b33be9d3617558f13318493c61206db6fbd47eb0678a2a578869db4d3cc" Oct 10 07:44:33 crc kubenswrapper[4822]: I1010 07:44:33.177509 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"} Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.169321 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f"] Oct 10 07:45:00 crc kubenswrapper[4822]: E1010 07:45:00.170916 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="extract-content" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.170943 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="extract-content" Oct 10 07:45:00 crc kubenswrapper[4822]: E1010 07:45:00.170978 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.170990 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4822]: E1010 07:45:00.171039 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="extract-utilities" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.171054 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="extract-utilities" Oct 10 07:45:00 crc kubenswrapper[4822]: E1010 07:45:00.171072 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b4f069-5332-4d11-b017-2acb9a015432" containerName="mariadb-client-2" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.171084 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b4f069-5332-4d11-b017-2acb9a015432" containerName="mariadb-client-2" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.171558 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc2ba55-870c-4d13-801d-f5ed2d3c35c5" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.171576 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="17b4f069-5332-4d11-b017-2acb9a015432" containerName="mariadb-client-2" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.172583 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.177030 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.177352 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.186331 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f"] Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.230855 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.230906 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krpz\" (UniqueName: \"kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.231169 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.333067 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.333521 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krpz\" (UniqueName: \"kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.333904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.334681 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.344966 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.368919 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krpz\" (UniqueName: \"kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz\") pod \"collect-profiles-29334705-fj56f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.510108 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:00 crc kubenswrapper[4822]: I1010 07:45:00.977734 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f"] Oct 10 07:45:01 crc kubenswrapper[4822]: I1010 07:45:01.462144 4822 generic.go:334] "Generic (PLEG): container finished" podID="0e67f756-5528-4a8d-99f3-bec56fefc38f" containerID="303e38474fea639c35df7651edc3d1ecd052c0b996fe65235b15aa4722f1ec97" exitCode=0 Oct 10 07:45:01 crc kubenswrapper[4822]: I1010 07:45:01.462238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" event={"ID":"0e67f756-5528-4a8d-99f3-bec56fefc38f","Type":"ContainerDied","Data":"303e38474fea639c35df7651edc3d1ecd052c0b996fe65235b15aa4722f1ec97"} Oct 10 07:45:01 crc kubenswrapper[4822]: I1010 07:45:01.463087 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" 
event={"ID":"0e67f756-5528-4a8d-99f3-bec56fefc38f","Type":"ContainerStarted","Data":"a0caa63040577d2f09e697e05d609548b6a5d2f58db49f1bbab39c48b6b9e18d"} Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.767616 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.778843 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4krpz\" (UniqueName: \"kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz\") pod \"0e67f756-5528-4a8d-99f3-bec56fefc38f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.778926 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume\") pod \"0e67f756-5528-4a8d-99f3-bec56fefc38f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.778955 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume\") pod \"0e67f756-5528-4a8d-99f3-bec56fefc38f\" (UID: \"0e67f756-5528-4a8d-99f3-bec56fefc38f\") " Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.783966 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e67f756-5528-4a8d-99f3-bec56fefc38f" (UID: "0e67f756-5528-4a8d-99f3-bec56fefc38f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.788031 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz" (OuterVolumeSpecName: "kube-api-access-4krpz") pod "0e67f756-5528-4a8d-99f3-bec56fefc38f" (UID: "0e67f756-5528-4a8d-99f3-bec56fefc38f"). InnerVolumeSpecName "kube-api-access-4krpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.788656 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e67f756-5528-4a8d-99f3-bec56fefc38f" (UID: "0e67f756-5528-4a8d-99f3-bec56fefc38f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.880205 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4krpz\" (UniqueName: \"kubernetes.io/projected/0e67f756-5528-4a8d-99f3-bec56fefc38f-kube-api-access-4krpz\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.880242 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e67f756-5528-4a8d-99f3-bec56fefc38f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:02 crc kubenswrapper[4822]: I1010 07:45:02.880257 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e67f756-5528-4a8d-99f3-bec56fefc38f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:03 crc kubenswrapper[4822]: I1010 07:45:03.481875 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" 
event={"ID":"0e67f756-5528-4a8d-99f3-bec56fefc38f","Type":"ContainerDied","Data":"a0caa63040577d2f09e697e05d609548b6a5d2f58db49f1bbab39c48b6b9e18d"} Oct 10 07:45:03 crc kubenswrapper[4822]: I1010 07:45:03.481937 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0caa63040577d2f09e697e05d609548b6a5d2f58db49f1bbab39c48b6b9e18d" Oct 10 07:45:03 crc kubenswrapper[4822]: I1010 07:45:03.481957 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f" Oct 10 07:45:03 crc kubenswrapper[4822]: I1010 07:45:03.878491 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp"] Oct 10 07:45:03 crc kubenswrapper[4822]: I1010 07:45:03.884147 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-77gvp"] Oct 10 07:45:05 crc kubenswrapper[4822]: I1010 07:45:05.669212 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1" path="/var/lib/kubelet/pods/0ce13cdc-4e7e-4cd7-9e6b-0ef12eaa2fa1/volumes" Oct 10 07:45:22 crc kubenswrapper[4822]: I1010 07:45:22.716457 4822 scope.go:117] "RemoveContainer" containerID="bcb25ed755addb1e134da06154c34459f4a5c0bd8dae051dce5d4c69d16ff75d" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.104707 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:23 crc kubenswrapper[4822]: E1010 07:46:23.105718 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e67f756-5528-4a8d-99f3-bec56fefc38f" containerName="collect-profiles" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.105734 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e67f756-5528-4a8d-99f3-bec56fefc38f" containerName="collect-profiles" Oct 10 07:46:23 crc 
kubenswrapper[4822]: I1010 07:46:23.106106 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e67f756-5528-4a8d-99f3-bec56fefc38f" containerName="collect-profiles" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.107445 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.121293 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.193180 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrlf\" (UniqueName: \"kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.193310 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.193407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.295109 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.295202 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrlf\" (UniqueName: \"kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.295246 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.295693 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.295973 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.313480 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrlf\" (UniqueName: 
\"kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf\") pod \"certified-operators-2kc96\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.472050 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:23 crc kubenswrapper[4822]: I1010 07:46:23.895123 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:24 crc kubenswrapper[4822]: I1010 07:46:24.292735 4822 generic.go:334] "Generic (PLEG): container finished" podID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerID="9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de" exitCode=0 Oct 10 07:46:24 crc kubenswrapper[4822]: I1010 07:46:24.292935 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerDied","Data":"9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de"} Oct 10 07:46:24 crc kubenswrapper[4822]: I1010 07:46:24.293271 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerStarted","Data":"647b1a98af05721efcc6b2d26b77b72ca7ef6c581337e422af8042af8e9682cb"} Oct 10 07:46:25 crc kubenswrapper[4822]: I1010 07:46:25.309951 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerStarted","Data":"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e"} Oct 10 07:46:26 crc kubenswrapper[4822]: I1010 07:46:26.322927 4822 generic.go:334] "Generic (PLEG): container finished" podID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" 
containerID="287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e" exitCode=0 Oct 10 07:46:26 crc kubenswrapper[4822]: I1010 07:46:26.322966 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerDied","Data":"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e"} Oct 10 07:46:27 crc kubenswrapper[4822]: I1010 07:46:27.334249 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerStarted","Data":"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c"} Oct 10 07:46:27 crc kubenswrapper[4822]: I1010 07:46:27.367077 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2kc96" podStartSLOduration=1.850125872 podStartE2EDuration="4.36705885s" podCreationTimestamp="2025-10-10 07:46:23 +0000 UTC" firstStartedPulling="2025-10-10 07:46:24.295170882 +0000 UTC m=+4931.390329128" lastFinishedPulling="2025-10-10 07:46:26.81210387 +0000 UTC m=+4933.907262106" observedRunningTime="2025-10-10 07:46:27.359320737 +0000 UTC m=+4934.454478953" watchObservedRunningTime="2025-10-10 07:46:27.36705885 +0000 UTC m=+4934.462217086" Oct 10 07:46:33 crc kubenswrapper[4822]: I1010 07:46:33.472863 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:33 crc kubenswrapper[4822]: I1010 07:46:33.473367 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:33 crc kubenswrapper[4822]: I1010 07:46:33.552792 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:34 crc kubenswrapper[4822]: I1010 07:46:34.447858 
4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:34 crc kubenswrapper[4822]: I1010 07:46:34.507169 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:36 crc kubenswrapper[4822]: I1010 07:46:36.426358 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2kc96" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="registry-server" containerID="cri-o://8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c" gracePeriod=2 Oct 10 07:46:36 crc kubenswrapper[4822]: I1010 07:46:36.906845 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.089041 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrlf\" (UniqueName: \"kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf\") pod \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.089150 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities\") pod \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.089185 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content\") pod \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\" (UID: \"bfe6c53c-e820-408d-a2bc-c91b82cf0174\") " Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 
07:46:37.090378 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities" (OuterVolumeSpecName: "utilities") pod "bfe6c53c-e820-408d-a2bc-c91b82cf0174" (UID: "bfe6c53c-e820-408d-a2bc-c91b82cf0174"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.098443 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf" (OuterVolumeSpecName: "kube-api-access-7zrlf") pod "bfe6c53c-e820-408d-a2bc-c91b82cf0174" (UID: "bfe6c53c-e820-408d-a2bc-c91b82cf0174"). InnerVolumeSpecName "kube-api-access-7zrlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.148819 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfe6c53c-e820-408d-a2bc-c91b82cf0174" (UID: "bfe6c53c-e820-408d-a2bc-c91b82cf0174"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.192491 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrlf\" (UniqueName: \"kubernetes.io/projected/bfe6c53c-e820-408d-a2bc-c91b82cf0174-kube-api-access-7zrlf\") on node \"crc\" DevicePath \"\"" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.192554 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.192570 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe6c53c-e820-408d-a2bc-c91b82cf0174-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.442590 4822 generic.go:334] "Generic (PLEG): container finished" podID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerID="8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c" exitCode=0 Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.442658 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerDied","Data":"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c"} Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.442701 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kc96" event={"ID":"bfe6c53c-e820-408d-a2bc-c91b82cf0174","Type":"ContainerDied","Data":"647b1a98af05721efcc6b2d26b77b72ca7ef6c581337e422af8042af8e9682cb"} Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.442729 4822 scope.go:117] "RemoveContainer" containerID="8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 
07:46:37.444265 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kc96" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.478316 4822 scope.go:117] "RemoveContainer" containerID="287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.531934 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.540266 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2kc96"] Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.542956 4822 scope.go:117] "RemoveContainer" containerID="9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.564464 4822 scope.go:117] "RemoveContainer" containerID="8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c" Oct 10 07:46:37 crc kubenswrapper[4822]: E1010 07:46:37.565076 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c\": container with ID starting with 8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c not found: ID does not exist" containerID="8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.565116 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c"} err="failed to get container status \"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c\": rpc error: code = NotFound desc = could not find container \"8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c\": container with ID starting with 
8c8d483a7f94ec7cc4b6713d2aa1837aa34c53772ff1ec3bb60b4ad16575606c not found: ID does not exist" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.565145 4822 scope.go:117] "RemoveContainer" containerID="287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e" Oct 10 07:46:37 crc kubenswrapper[4822]: E1010 07:46:37.566074 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e\": container with ID starting with 287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e not found: ID does not exist" containerID="287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.566153 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e"} err="failed to get container status \"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e\": rpc error: code = NotFound desc = could not find container \"287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e\": container with ID starting with 287e326c8340885871291d2830c1c1f87c8eab2d550029b8e97c64300407838e not found: ID does not exist" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.566209 4822 scope.go:117] "RemoveContainer" containerID="9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de" Oct 10 07:46:37 crc kubenswrapper[4822]: E1010 07:46:37.566599 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de\": container with ID starting with 9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de not found: ID does not exist" containerID="9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de" Oct 10 07:46:37 crc 
kubenswrapper[4822]: I1010 07:46:37.566639 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de"} err="failed to get container status \"9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de\": rpc error: code = NotFound desc = could not find container \"9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de\": container with ID starting with 9675aac25a27427c8f03e6ef0ac4d5cfa05fc5842f7dc85745eceb81e63bb7de not found: ID does not exist" Oct 10 07:46:37 crc kubenswrapper[4822]: I1010 07:46:37.662225 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" path="/var/lib/kubelet/pods/bfe6c53c-e820-408d-a2bc-c91b82cf0174/volumes" Oct 10 07:47:01 crc kubenswrapper[4822]: I1010 07:47:01.337296 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:47:01 crc kubenswrapper[4822]: I1010 07:47:01.337975 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:47:31 crc kubenswrapper[4822]: I1010 07:47:31.337079 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:47:31 crc kubenswrapper[4822]: I1010 07:47:31.337909 4822 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:48:01 crc kubenswrapper[4822]: I1010 07:48:01.336519 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:48:01 crc kubenswrapper[4822]: I1010 07:48:01.337065 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:48:01 crc kubenswrapper[4822]: I1010 07:48:01.337112 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:48:01 crc kubenswrapper[4822]: I1010 07:48:01.338023 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:48:01 crc kubenswrapper[4822]: I1010 07:48:01.338114 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" 
containerName="machine-config-daemon" containerID="cri-o://7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" gracePeriod=600 Oct 10 07:48:01 crc kubenswrapper[4822]: E1010 07:48:01.464418 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:48:02 crc kubenswrapper[4822]: I1010 07:48:02.297139 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" exitCode=0 Oct 10 07:48:02 crc kubenswrapper[4822]: I1010 07:48:02.297234 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"} Oct 10 07:48:02 crc kubenswrapper[4822]: I1010 07:48:02.297610 4822 scope.go:117] "RemoveContainer" containerID="2398a8e542386da14b10b31ffe04ff05e312cb55595e888f7591cd86fcaa9dae" Oct 10 07:48:02 crc kubenswrapper[4822]: I1010 07:48:02.298257 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:48:02 crc kubenswrapper[4822]: E1010 07:48:02.298568 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:48:16 crc kubenswrapper[4822]: I1010 07:48:16.651312 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:48:16 crc kubenswrapper[4822]: E1010 07:48:16.652214 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:48:22 crc kubenswrapper[4822]: I1010 07:48:22.864030 4822 scope.go:117] "RemoveContainer" containerID="3e47d42ea7a8a1c83d3749a5f423c560d2632c77e57662e21e61300f45cf9902" Oct 10 07:48:22 crc kubenswrapper[4822]: I1010 07:48:22.898284 4822 scope.go:117] "RemoveContainer" containerID="6dcc46307d77a1f15c0928a4eabe9f7ded3784f4af62beb4cd556c2205346219" Oct 10 07:48:31 crc kubenswrapper[4822]: I1010 07:48:31.651228 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:48:31 crc kubenswrapper[4822]: E1010 07:48:31.652249 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.943052 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 07:48:38 crc 
kubenswrapper[4822]: E1010 07:48:38.943851 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="extract-content" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.943874 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="extract-content" Oct 10 07:48:38 crc kubenswrapper[4822]: E1010 07:48:38.943904 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="extract-utilities" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.943916 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="extract-utilities" Oct 10 07:48:38 crc kubenswrapper[4822]: E1010 07:48:38.943937 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="registry-server" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.943949 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="registry-server" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.944197 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe6c53c-e820-408d-a2bc-c91b82cf0174" containerName="registry-server" Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.945033 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.947842 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-v2br5"
Oct 10 07:48:38 crc kubenswrapper[4822]: I1010 07:48:38.956116 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.115460 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.115578 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22lc\" (UniqueName: \"kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.216988 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.217081 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22lc\" (UniqueName: \"kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.220300 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.220327 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ec06767847d14fc5f99e341c9a141deb80339d191eba83ca5aed4a5cc99cf4e5/globalmount\"" pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.244890 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22lc\" (UniqueName: \"kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.266513 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") pod \"mariadb-copy-data\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") " pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.277449 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 10 07:48:39 crc kubenswrapper[4822]: I1010 07:48:39.832500 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 10 07:48:40 crc kubenswrapper[4822]: I1010 07:48:40.751400 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"6e167c28-a75e-4df7-b218-c5b29939fa82","Type":"ContainerStarted","Data":"7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1"}
Oct 10 07:48:40 crc kubenswrapper[4822]: I1010 07:48:40.751908 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"6e167c28-a75e-4df7-b218-c5b29939fa82","Type":"ContainerStarted","Data":"053213e0bfdd444e4b5fe3374e814b64343cda59cc40280950c1ca0c140c5692"}
Oct 10 07:48:40 crc kubenswrapper[4822]: I1010 07:48:40.783725 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.783692919 podStartE2EDuration="3.783692919s" podCreationTimestamp="2025-10-10 07:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:48:40.772178117 +0000 UTC m=+5067.867336313" watchObservedRunningTime="2025-10-10 07:48:40.783692919 +0000 UTC m=+5067.878851155"
Oct 10 07:48:41 crc kubenswrapper[4822]: E1010 07:48:41.500056 4822 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.180:34280->38.102.83.180:44473: write tcp 38.102.83.180:34280->38.102.83.180:44473: write: broken pipe
Oct 10 07:48:42 crc kubenswrapper[4822]: I1010 07:48:42.917904 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:42 crc kubenswrapper[4822]: I1010 07:48:42.919220 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:42 crc kubenswrapper[4822]: I1010 07:48:42.927185 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:42 crc kubenswrapper[4822]: I1010 07:48:42.983452 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzs7h\" (UniqueName: \"kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h\") pod \"mariadb-client\" (UID: \"59235bc3-b7ce-4eeb-a69d-da836cfde631\") " pod="openstack/mariadb-client"
Oct 10 07:48:43 crc kubenswrapper[4822]: I1010 07:48:43.085067 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzs7h\" (UniqueName: \"kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h\") pod \"mariadb-client\" (UID: \"59235bc3-b7ce-4eeb-a69d-da836cfde631\") " pod="openstack/mariadb-client"
Oct 10 07:48:43 crc kubenswrapper[4822]: I1010 07:48:43.119786 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzs7h\" (UniqueName: \"kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h\") pod \"mariadb-client\" (UID: \"59235bc3-b7ce-4eeb-a69d-da836cfde631\") " pod="openstack/mariadb-client"
Oct 10 07:48:43 crc kubenswrapper[4822]: I1010 07:48:43.237758 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:43 crc kubenswrapper[4822]: I1010 07:48:43.770202 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:43 crc kubenswrapper[4822]: W1010 07:48:43.773344 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59235bc3_b7ce_4eeb_a69d_da836cfde631.slice/crio-9bc0d532713c6433c832b2360a6fd80e3c490ea4ae401893c17d2b75ac32e785 WatchSource:0}: Error finding container 9bc0d532713c6433c832b2360a6fd80e3c490ea4ae401893c17d2b75ac32e785: Status 404 returned error can't find the container with id 9bc0d532713c6433c832b2360a6fd80e3c490ea4ae401893c17d2b75ac32e785
Oct 10 07:48:43 crc kubenswrapper[4822]: I1010 07:48:43.783121 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"59235bc3-b7ce-4eeb-a69d-da836cfde631","Type":"ContainerStarted","Data":"9bc0d532713c6433c832b2360a6fd80e3c490ea4ae401893c17d2b75ac32e785"}
Oct 10 07:48:44 crc kubenswrapper[4822]: I1010 07:48:44.795494 4822 generic.go:334] "Generic (PLEG): container finished" podID="59235bc3-b7ce-4eeb-a69d-da836cfde631" containerID="291bc0da022ba967a47d0b2c9ad816634c6acd21bdc586913388461604e80e95" exitCode=0
Oct 10 07:48:44 crc kubenswrapper[4822]: I1010 07:48:44.795647 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"59235bc3-b7ce-4eeb-a69d-da836cfde631","Type":"ContainerDied","Data":"291bc0da022ba967a47d0b2c9ad816634c6acd21bdc586913388461604e80e95"}
Oct 10 07:48:45 crc kubenswrapper[4822]: I1010 07:48:45.651154 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:48:45 crc kubenswrapper[4822]: E1010 07:48:45.652671 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.118643 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.140233 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_59235bc3-b7ce-4eeb-a69d-da836cfde631/mariadb-client/0.log"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.178083 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.182768 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.243062 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzs7h\" (UniqueName: \"kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h\") pod \"59235bc3-b7ce-4eeb-a69d-da836cfde631\" (UID: \"59235bc3-b7ce-4eeb-a69d-da836cfde631\") "
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.250464 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h" (OuterVolumeSpecName: "kube-api-access-qzs7h") pod "59235bc3-b7ce-4eeb-a69d-da836cfde631" (UID: "59235bc3-b7ce-4eeb-a69d-da836cfde631"). InnerVolumeSpecName "kube-api-access-qzs7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.314367 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:46 crc kubenswrapper[4822]: E1010 07:48:46.315041 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59235bc3-b7ce-4eeb-a69d-da836cfde631" containerName="mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.315068 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="59235bc3-b7ce-4eeb-a69d-da836cfde631" containerName="mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.315302 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="59235bc3-b7ce-4eeb-a69d-da836cfde631" containerName="mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.316140 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.324046 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.344877 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzs7h\" (UniqueName: \"kubernetes.io/projected/59235bc3-b7ce-4eeb-a69d-da836cfde631-kube-api-access-qzs7h\") on node \"crc\" DevicePath \"\""
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.446133 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5v9f\" (UniqueName: \"kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f\") pod \"mariadb-client\" (UID: \"03a26b14-6f10-4b26-860c-41d14a2158ae\") " pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.548726 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5v9f\" (UniqueName: \"kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f\") pod \"mariadb-client\" (UID: \"03a26b14-6f10-4b26-860c-41d14a2158ae\") " pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.568927 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5v9f\" (UniqueName: \"kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f\") pod \"mariadb-client\" (UID: \"03a26b14-6f10-4b26-860c-41d14a2158ae\") " pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.641235 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.826915 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc0d532713c6433c832b2360a6fd80e3c490ea4ae401893c17d2b75ac32e785"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.827217 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.853258 4822 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="59235bc3-b7ce-4eeb-a69d-da836cfde631" podUID="03a26b14-6f10-4b26-860c-41d14a2158ae"
Oct 10 07:48:46 crc kubenswrapper[4822]: I1010 07:48:46.944560 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:46 crc kubenswrapper[4822]: W1010 07:48:46.952302 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03a26b14_6f10_4b26_860c_41d14a2158ae.slice/crio-b5955353d730900320b8d6b360b85c1cc55dbcf99a0d59c91bcc81b1a0a1d080 WatchSource:0}: Error finding container b5955353d730900320b8d6b360b85c1cc55dbcf99a0d59c91bcc81b1a0a1d080: Status 404 returned error can't find the container with id b5955353d730900320b8d6b360b85c1cc55dbcf99a0d59c91bcc81b1a0a1d080
Oct 10 07:48:47 crc kubenswrapper[4822]: I1010 07:48:47.660924 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59235bc3-b7ce-4eeb-a69d-da836cfde631" path="/var/lib/kubelet/pods/59235bc3-b7ce-4eeb-a69d-da836cfde631/volumes"
Oct 10 07:48:47 crc kubenswrapper[4822]: I1010 07:48:47.837914 4822 generic.go:334] "Generic (PLEG): container finished" podID="03a26b14-6f10-4b26-860c-41d14a2158ae" containerID="36a9fff97d6b5cc02301a1c5bc3a2fb56fc91d4a1777ff686cc53cd7c40ce170" exitCode=0
Oct 10 07:48:47 crc kubenswrapper[4822]: I1010 07:48:47.838060 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"03a26b14-6f10-4b26-860c-41d14a2158ae","Type":"ContainerDied","Data":"36a9fff97d6b5cc02301a1c5bc3a2fb56fc91d4a1777ff686cc53cd7c40ce170"}
Oct 10 07:48:47 crc kubenswrapper[4822]: I1010 07:48:47.838404 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"03a26b14-6f10-4b26-860c-41d14a2158ae","Type":"ContainerStarted","Data":"b5955353d730900320b8d6b360b85c1cc55dbcf99a0d59c91bcc81b1a0a1d080"}
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.219615 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.238267 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_03a26b14-6f10-4b26-860c-41d14a2158ae/mariadb-client/0.log"
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.279533 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.291915 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.296515 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5v9f\" (UniqueName: \"kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f\") pod \"03a26b14-6f10-4b26-860c-41d14a2158ae\" (UID: \"03a26b14-6f10-4b26-860c-41d14a2158ae\") "
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.301519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f" (OuterVolumeSpecName: "kube-api-access-m5v9f") pod "03a26b14-6f10-4b26-860c-41d14a2158ae" (UID: "03a26b14-6f10-4b26-860c-41d14a2158ae"). InnerVolumeSpecName "kube-api-access-m5v9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.399041 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5v9f\" (UniqueName: \"kubernetes.io/projected/03a26b14-6f10-4b26-860c-41d14a2158ae-kube-api-access-m5v9f\") on node \"crc\" DevicePath \"\""
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.662919 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a26b14-6f10-4b26-860c-41d14a2158ae" path="/var/lib/kubelet/pods/03a26b14-6f10-4b26-860c-41d14a2158ae/volumes"
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.859632 4822 scope.go:117] "RemoveContainer" containerID="36a9fff97d6b5cc02301a1c5bc3a2fb56fc91d4a1777ff686cc53cd7c40ce170"
Oct 10 07:48:49 crc kubenswrapper[4822]: I1010 07:48:49.859709 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Oct 10 07:48:59 crc kubenswrapper[4822]: I1010 07:48:59.650908 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:48:59 crc kubenswrapper[4822]: E1010 07:48:59.652085 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:49:12 crc kubenswrapper[4822]: I1010 07:49:12.669343 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:49:12 crc kubenswrapper[4822]: E1010 07:49:12.670779 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:49:27 crc kubenswrapper[4822]: I1010 07:49:27.650624 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:49:27 crc kubenswrapper[4822]: E1010 07:49:27.652171 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:49:38 crc kubenswrapper[4822]: I1010 07:49:38.650324 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:49:38 crc kubenswrapper[4822]: E1010 07:49:38.651240 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:49:49 crc kubenswrapper[4822]: I1010 07:49:49.651216 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:49:49 crc kubenswrapper[4822]: E1010 07:49:49.652458 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:50:03 crc kubenswrapper[4822]: I1010 07:50:03.659256 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:50:03 crc kubenswrapper[4822]: E1010 07:50:03.660208 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:50:14 crc kubenswrapper[4822]: I1010 07:50:14.650614 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:50:14 crc kubenswrapper[4822]: E1010 07:50:14.651523 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.591027 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 07:50:19 crc kubenswrapper[4822]: E1010 07:50:19.592350 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a26b14-6f10-4b26-860c-41d14a2158ae" containerName="mariadb-client"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.592382 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a26b14-6f10-4b26-860c-41d14a2158ae" containerName="mariadb-client"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.592790 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a26b14-6f10-4b26-860c-41d14a2158ae" containerName="mariadb-client"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.594722 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.599006 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ws5sm"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.600043 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.601048 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.601348 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.603706 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.612678 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.615371 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.627167 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.647480 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.676734 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.695343 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmph\" (UniqueName: \"kubernetes.io/projected/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-kube-api-access-bhmph\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.695375 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.695398 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.695421 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.695615 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-config\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.696126 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.744492 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.747952 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.752088 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rmslr"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.752163 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.752440 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.766309 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.769559 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.776729 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.787374 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.789088 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798395 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ktff\" (UniqueName: \"kubernetes.io/projected/45235ece-2520-44af-bb2d-2083eaa25753-kube-api-access-2ktff\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798468 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798531 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-config\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798580 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmph\" (UniqueName: \"kubernetes.io/projected/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-kube-api-access-bhmph\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798624 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798652 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45235ece-2520-44af-bb2d-2083eaa25753-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798677 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798704 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt5zn\" (UniqueName: \"kubernetes.io/projected/2239595c-2d41-4737-8496-a42a7215fdcc-kube-api-access-dt5zn\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798729 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.798773 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2239595c-2d41-4737-8496-a42a7215fdcc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799266 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799360 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799408 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45235ece-2520-44af-bb2d-2083eaa25753-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799436 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799460 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2239595c-2d41-4737-8496-a42a7215fdcc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799509 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799546 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-config\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799588 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.799653 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-config\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.801220 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-config\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.801858 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.804574 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.809076 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.809156 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ec11b329836f60f7bf9974dddb0db851fe76a1a4bd8476d86e8a06957cd38ba6/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.815074 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.820090 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmph\" (UniqueName: \"kubernetes.io/projected/92f35517-5623-47ab-b0b4-b9e9d9be4a9e-kube-api-access-bhmph\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.820949 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.866180 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d8a7ddd-27bd-45a5-8c6e-ac544176d4f3\") pod \"ovsdbserver-nb-2\" (UID: \"92f35517-5623-47ab-b0b4-b9e9d9be4a9e\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900673 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5280bf6-7247-4221-a89b-b6a8a7c6a425-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900731 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900785 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmjm\" (UniqueName: \"kubernetes.io/projected/d5280bf6-7247-4221-a89b-b6a8a7c6a425-kube-api-access-kvmjm\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900878 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2239595c-2d41-4737-8496-a42a7215fdcc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900914 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900940 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900959 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.900980 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901003 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45235ece-2520-44af-bb2d-2083eaa25753-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901020 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901036 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2239595c-2d41-4737-8496-a42a7215fdcc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901051 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5280bf6-7247-4221-a89b-b6a8a7c6a425-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901068 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97nk\" (UniqueName: \"kubernetes.io/projected/4a6a4a40-9317-4238-b106-7f2b1f900ad3-kube-api-access-v97nk\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901091 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901119 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqb4q\" (UniqueName: 
\"kubernetes.io/projected/8c74b491-d30e-4d3a-8dc8-532b38928ef6-kube-api-access-gqb4q\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901136 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901156 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c74b491-d30e-4d3a-8dc8-532b38928ef6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901176 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-config\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901193 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c74b491-d30e-4d3a-8dc8-532b38928ef6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901210 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-config\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " 
pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901234 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ktff\" (UniqueName: \"kubernetes.io/projected/45235ece-2520-44af-bb2d-2083eaa25753-kube-api-access-2ktff\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901249 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a6a4a40-9317-4238-b106-7f2b1f900ad3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6a4a40-9317-4238-b106-7f2b1f900ad3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901285 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901307 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-config\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901328 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901362 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45235ece-2520-44af-bb2d-2083eaa25753-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.901353 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2239595c-2d41-4737-8496-a42a7215fdcc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.902717 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45235ece-2520-44af-bb2d-2083eaa25753-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.903473 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.903628 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2239595c-2d41-4737-8496-a42a7215fdcc-config\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " 
pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.904478 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-config\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.904547 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45235ece-2520-44af-bb2d-2083eaa25753-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.904675 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-config\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.904755 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt5zn\" (UniqueName: \"kubernetes.io/projected/2239595c-2d41-4737-8496-a42a7215fdcc-kube-api-access-dt5zn\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.904942 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.905782 4822 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.905833 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ded36d6e71ff2addb5b310eb1b9400c3691e4548335fd69e715387ccbac1f5f7/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.905912 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.905944 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0f63b508e3bb5da6d7bdb5106fba0b71ae53eb70b2e07360fbc64fb22c24df4/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.910777 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45235ece-2520-44af-bb2d-2083eaa25753-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.914744 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2239595c-2d41-4737-8496-a42a7215fdcc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.923579 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ktff\" (UniqueName: \"kubernetes.io/projected/45235ece-2520-44af-bb2d-2083eaa25753-kube-api-access-2ktff\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.924318 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt5zn\" (UniqueName: \"kubernetes.io/projected/2239595c-2d41-4737-8496-a42a7215fdcc-kube-api-access-dt5zn\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.946700 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe5178a4-5b39-4c0a-b6fd-76fc3dc91d1e\") pod \"ovsdbserver-nb-1\" (UID: \"45235ece-2520-44af-bb2d-2083eaa25753\") " pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.948653 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54651eaa-2403-4ce1-bbba-0c2b62773915\") pod \"ovsdbserver-nb-0\" (UID: \"2239595c-2d41-4737-8496-a42a7215fdcc\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.962824 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:19 crc kubenswrapper[4822]: I1010 07:50:19.973387 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.006978 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007024 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmjm\" (UniqueName: \"kubernetes.io/projected/d5280bf6-7247-4221-a89b-b6a8a7c6a425-kube-api-access-kvmjm\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007050 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007078 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007096 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007124 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5280bf6-7247-4221-a89b-b6a8a7c6a425-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007141 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v97nk\" (UniqueName: \"kubernetes.io/projected/4a6a4a40-9317-4238-b106-7f2b1f900ad3-kube-api-access-v97nk\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007176 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqb4q\" (UniqueName: \"kubernetes.io/projected/8c74b491-d30e-4d3a-8dc8-532b38928ef6-kube-api-access-gqb4q\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007198 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c74b491-d30e-4d3a-8dc8-532b38928ef6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007218 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-config\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " 
pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007234 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c74b491-d30e-4d3a-8dc8-532b38928ef6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007261 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a6a4a40-9317-4238-b106-7f2b1f900ad3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007277 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6a4a40-9317-4238-b106-7f2b1f900ad3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007299 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007326 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007359 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-config\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007381 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.007400 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5280bf6-7247-4221-a89b-b6a8a7c6a425-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.008417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.009140 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c74b491-d30e-4d3a-8dc8-532b38928ef6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.009178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5280bf6-7247-4221-a89b-b6a8a7c6a425-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " 
pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.010147 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-config\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.010166 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c74b491-d30e-4d3a-8dc8-532b38928ef6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.010427 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a6a4a40-9317-4238-b106-7f2b1f900ad3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.011184 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.011900 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.011934 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2b43bb64f615ee9d2b633197f7f7b3cd2413d6044c670f09725b96f4b635a35/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.012363 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5280bf6-7247-4221-a89b-b6a8a7c6a425-config\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.014061 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.014091 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/59cb802f189f34b7eca4714da1986b03e9742d153cc10221981b778cea841479/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.014712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6a4a40-9317-4238-b106-7f2b1f900ad3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.018741 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6a4a40-9317-4238-b106-7f2b1f900ad3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.018920 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c74b491-d30e-4d3a-8dc8-532b38928ef6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.023362 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.023416 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9d5a4c74265d96ee952b48683451bea52fca6056499485c4b60ff6f8047d23d0/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.025897 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmjm\" (UniqueName: \"kubernetes.io/projected/d5280bf6-7247-4221-a89b-b6a8a7c6a425-kube-api-access-kvmjm\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.026598 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97nk\" (UniqueName: \"kubernetes.io/projected/4a6a4a40-9317-4238-b106-7f2b1f900ad3-kube-api-access-v97nk\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.027717 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5280bf6-7247-4221-a89b-b6a8a7c6a425-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.028694 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqb4q\" (UniqueName: \"kubernetes.io/projected/8c74b491-d30e-4d3a-8dc8-532b38928ef6-kube-api-access-gqb4q\") pod \"ovsdbserver-sb-0\" (UID: 
\"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.051220 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c04fe4a-e33f-4d46-9325-23d37aae1d95\") pod \"ovsdbserver-sb-0\" (UID: \"8c74b491-d30e-4d3a-8dc8-532b38928ef6\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.078106 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7dfd1b60-13b1-45c8-be98-7b62cde8f39d\") pod \"ovsdbserver-sb-1\" (UID: \"d5280bf6-7247-4221-a89b-b6a8a7c6a425\") " pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.078748 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.080784 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ef565ff6-4173-425d-91f0-cf22e75a19b8\") pod \"ovsdbserver-sb-2\" (UID: \"4a6a4a40-9317-4238-b106-7f2b1f900ad3\") " pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.095841 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.111123 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.244514 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.528498 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.615084 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.703083 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:50:20 crc kubenswrapper[4822]: W1010 07:50:20.707820 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2239595c_2d41_4737_8496_a42a7215fdcc.slice/crio-1e90e24067c22cbb0b23bd1c3d54c24c80306dc60721eb36da13f732267674b1 WatchSource:0}: Error finding container 1e90e24067c22cbb0b23bd1c3d54c24c80306dc60721eb36da13f732267674b1: Status 404 returned error can't find the container with id 1e90e24067c22cbb0b23bd1c3d54c24c80306dc60721eb36da13f732267674b1 Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.789926 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"92f35517-5623-47ab-b0b4-b9e9d9be4a9e","Type":"ContainerStarted","Data":"b96195bda7f7e029ee293b19d12cadb15262929e447a9669cdc1186cb2c02039"} Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.792468 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2239595c-2d41-4737-8496-a42a7215fdcc","Type":"ContainerStarted","Data":"1e90e24067c22cbb0b23bd1c3d54c24c80306dc60721eb36da13f732267674b1"} Oct 10 07:50:20 crc kubenswrapper[4822]: I1010 07:50:20.794556 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"45235ece-2520-44af-bb2d-2083eaa25753","Type":"ContainerStarted","Data":"71c0601bb65755bed359600564dd2d49644f88c6bf93d48c5dfc3f39a1bf3ccf"} Oct 10 07:50:20 crc 
kubenswrapper[4822]: I1010 07:50:20.794652 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"45235ece-2520-44af-bb2d-2083eaa25753","Type":"ContainerStarted","Data":"aaad190dfe1354f79bbb3d19569569e1b569c326bcde9547a88aee303c36c289"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.410052 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.676257 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 10 07:50:21 crc kubenswrapper[4822]: W1010 07:50:21.677155 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6a4a40_9317_4238_b106_7f2b1f900ad3.slice/crio-43df724a302bba87fa01b1539cd6d02ff826027abfb88fde24a35f17fc0a7eb3 WatchSource:0}: Error finding container 43df724a302bba87fa01b1539cd6d02ff826027abfb88fde24a35f17fc0a7eb3: Status 404 returned error can't find the container with id 43df724a302bba87fa01b1539cd6d02ff826027abfb88fde24a35f17fc0a7eb3 Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.806324 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"45235ece-2520-44af-bb2d-2083eaa25753","Type":"ContainerStarted","Data":"c5d7f91207d8f673488333a5211d6f3cabf5229b100233ed06b11e780da9dce9"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.809688 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d5280bf6-7247-4221-a89b-b6a8a7c6a425","Type":"ContainerStarted","Data":"c45490d8086eee6b90014df82cbdb479e55fb984968d0b1b022441aaa38cbc95"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.809719 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"d5280bf6-7247-4221-a89b-b6a8a7c6a425","Type":"ContainerStarted","Data":"dc6b539c7557e9f09acd168de4f49a6532d36943f64d621f580dca6132ce45e0"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.809728 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d5280bf6-7247-4221-a89b-b6a8a7c6a425","Type":"ContainerStarted","Data":"29cca0fb0c779c24d6cad69ea8298179e64e5d27b9981f3053c79ac98e1d94c3"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.811604 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"92f35517-5623-47ab-b0b4-b9e9d9be4a9e","Type":"ContainerStarted","Data":"e63169609de2b99d16f38da96c0ac460ca0c880ac3ce89b2b9f835c59f4dc356"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.811671 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"92f35517-5623-47ab-b0b4-b9e9d9be4a9e","Type":"ContainerStarted","Data":"f202cf89f54f19bc586664e6658bfbc37ccc191bd19afbf451271f719f487dd2"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.813833 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4a6a4a40-9317-4238-b106-7f2b1f900ad3","Type":"ContainerStarted","Data":"43df724a302bba87fa01b1539cd6d02ff826027abfb88fde24a35f17fc0a7eb3"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.817001 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2239595c-2d41-4737-8496-a42a7215fdcc","Type":"ContainerStarted","Data":"d4ab57e72da74a61bc92e6a7341713814bfb3f72dfb7dcbcc41a0c910b6e5014"} Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.817031 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2239595c-2d41-4737-8496-a42a7215fdcc","Type":"ContainerStarted","Data":"9c7e790a0cfd75139c9330016155569660c46e44ee8abc3b2e883c22690d9ac5"} Oct 10 07:50:21 crc 
kubenswrapper[4822]: I1010 07:50:21.833261 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.833239441 podStartE2EDuration="3.833239441s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:21.827059503 +0000 UTC m=+5168.922217709" watchObservedRunningTime="2025-10-10 07:50:21.833239441 +0000 UTC m=+5168.928397637" Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.847415 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.8473907389999997 podStartE2EDuration="3.847390739s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:21.845236357 +0000 UTC m=+5168.940394593" watchObservedRunningTime="2025-10-10 07:50:21.847390739 +0000 UTC m=+5168.942548975" Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.867669 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.867644083 podStartE2EDuration="3.867644083s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:21.867636093 +0000 UTC m=+5168.962794369" watchObservedRunningTime="2025-10-10 07:50:21.867644083 +0000 UTC m=+5168.962802289" Oct 10 07:50:21 crc kubenswrapper[4822]: I1010 07:50:21.890101 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.89008049 podStartE2EDuration="3.89008049s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:21.885111397 +0000 UTC m=+5168.980269593" watchObservedRunningTime="2025-10-10 07:50:21.89008049 +0000 UTC m=+5168.985238686" Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.574029 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.826628 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4a6a4a40-9317-4238-b106-7f2b1f900ad3","Type":"ContainerStarted","Data":"0e6278935ae185fc06f151a810068160c07cf21f71b4c8eef88fd4711c934073"} Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.826675 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"4a6a4a40-9317-4238-b106-7f2b1f900ad3","Type":"ContainerStarted","Data":"da15e16755ab91eeab03443d06101598427a11a87aa0888d78189ce8984bb984"} Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.839054 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c74b491-d30e-4d3a-8dc8-532b38928ef6","Type":"ContainerStarted","Data":"ab8dfbbb1705f1d674d63f3e0cb724c127def9f45e542d054bea11ea2496e3c8"} Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.839101 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c74b491-d30e-4d3a-8dc8-532b38928ef6","Type":"ContainerStarted","Data":"8ef66dd351d86205f9e31b366505c50537a4a30565c5ecdf9de51a0616c79d41"} Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.857216 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.857190103 podStartE2EDuration="4.857190103s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 07:50:22.846468624 +0000 UTC m=+5169.941626830" watchObservedRunningTime="2025-10-10 07:50:22.857190103 +0000 UTC m=+5169.952348299" Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.964060 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:22 crc kubenswrapper[4822]: I1010 07:50:22.973430 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.007291 4822 scope.go:117] "RemoveContainer" containerID="8ae9c0019aa5d20a2316264924237ae7f7641162b70f36ec35daf6193e73bf14" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.031703 4822 scope.go:117] "RemoveContainer" containerID="e9f7058d53ab00c95779d74c02dd026312193e81c8ba0606a55dae7efdb833ec" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.055078 4822 scope.go:117] "RemoveContainer" containerID="0d5a761e2c52bf6d8078780afe6ae515e274759c6d54b37bd91966d62e886d41" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.087781 4822 scope.go:117] "RemoveContainer" containerID="e9b2ac8cc4e0b990dd7fa5e62af3327420a07ac93c87c46a75698fed681f0998" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.096729 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.111865 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.119519 4822 scope.go:117] "RemoveContainer" containerID="47d279f999265c83ca04ecafe09a0c894a410f3fd2521090f70517562a68f9a6" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.247191 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.305023 4822 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.848981 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c74b491-d30e-4d3a-8dc8-532b38928ef6","Type":"ContainerStarted","Data":"9667c7b09a41fd0eff25b3f84f75f22797e5d3dff8becef32e2cb8da2195ac12"} Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.850218 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:23 crc kubenswrapper[4822]: I1010 07:50:23.878143 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.878122208 podStartE2EDuration="5.878122208s" podCreationTimestamp="2025-10-10 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:23.872541057 +0000 UTC m=+5170.967699273" watchObservedRunningTime="2025-10-10 07:50:23.878122208 +0000 UTC m=+5170.973280404" Oct 10 07:50:24 crc kubenswrapper[4822]: I1010 07:50:24.964428 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:24 crc kubenswrapper[4822]: I1010 07:50:24.974160 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.079878 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.096469 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.112003 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 
07:50:25.295402 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.575009 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.576887 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.579307 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.591272 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.715662 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.716045 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslpn\" (UniqueName: \"kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.716129 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 
07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.716162 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.817580 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.817674 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.817761 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.817887 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslpn\" (UniqueName: \"kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.818584 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.818608 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.819303 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.845158 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslpn\" (UniqueName: \"kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn\") pod \"dnsmasq-dns-5b94f8f99-pnppc\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:25 crc kubenswrapper[4822]: I1010 07:50:25.895789 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.022621 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.033623 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.080015 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.092125 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.096444 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.127464 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.157430 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.162713 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.199335 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.395305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:26 crc kubenswrapper[4822]: W1010 07:50:26.411163 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e7a5d6a_dee0_4685_8205_86f5c0d2f34d.slice/crio-cdab3186a09cd4a22d0c79c6fd0705f8028dc82788befe2e575bd14e38ef759a WatchSource:0}: Error finding container cdab3186a09cd4a22d0c79c6fd0705f8028dc82788befe2e575bd14e38ef759a: Status 404 returned error can't find the container with id cdab3186a09cd4a22d0c79c6fd0705f8028dc82788befe2e575bd14e38ef759a Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.522837 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.555914 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.557453 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.561770 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.588398 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.737855 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.738060 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: 
\"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.738119 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.738294 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgzh\" (UniqueName: \"kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.738433 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.840031 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.840098 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: 
\"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.840167 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msgzh\" (UniqueName: \"kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.840236 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.840305 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.841537 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.841894 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc 
kubenswrapper[4822]: I1010 07:50:26.841992 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.842715 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.874466 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgzh\" (UniqueName: \"kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh\") pod \"dnsmasq-dns-86bfbd688f-x4kpx\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.880199 4822 generic.go:334] "Generic (PLEG): container finished" podID="7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" containerID="fe05aa7d689fc586ba9d3a1ba6983bc4ea1c4f1490f5f191a1842205052c066d" exitCode=0 Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.881961 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" event={"ID":"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d","Type":"ContainerDied","Data":"fe05aa7d689fc586ba9d3a1ba6983bc4ea1c4f1490f5f191a1842205052c066d"} Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.882003 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" 
event={"ID":"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d","Type":"ContainerStarted","Data":"cdab3186a09cd4a22d0c79c6fd0705f8028dc82788befe2e575bd14e38ef759a"} Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.893227 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:26 crc kubenswrapper[4822]: I1010 07:50:26.950004 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.191226 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.250137 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb\") pod \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.250224 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc\") pod \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.250322 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config\") pod \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.250406 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslpn\" (UniqueName: \"kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn\") 
pod \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\" (UID: \"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d\") " Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.255380 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn" (OuterVolumeSpecName: "kube-api-access-mslpn") pod "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" (UID: "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d"). InnerVolumeSpecName "kube-api-access-mslpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.269300 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" (UID: "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.269349 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config" (OuterVolumeSpecName: "config") pod "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" (UID: "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.269354 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" (UID: "7e7a5d6a-dee0-4685-8205-86f5c0d2f34d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.352107 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.352136 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslpn\" (UniqueName: \"kubernetes.io/projected/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-kube-api-access-mslpn\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.352151 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.352159 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.405123 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:50:27 crc kubenswrapper[4822]: W1010 07:50:27.408458 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c5450b_1686_49f2_abd2_22a6199e218b.slice/crio-384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40 WatchSource:0}: Error finding container 384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40: Status 404 returned error can't find the container with id 384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40 Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.891154 4822 generic.go:334] "Generic (PLEG): container finished" podID="f0c5450b-1686-49f2-abd2-22a6199e218b" 
containerID="db680deb468d8342ad9710cb93cb05ce7366b21663f39ca6cb8498a28e1f007e" exitCode=0 Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.891220 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" event={"ID":"f0c5450b-1686-49f2-abd2-22a6199e218b","Type":"ContainerDied","Data":"db680deb468d8342ad9710cb93cb05ce7366b21663f39ca6cb8498a28e1f007e"} Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.891269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" event={"ID":"f0c5450b-1686-49f2-abd2-22a6199e218b","Type":"ContainerStarted","Data":"384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40"} Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.894501 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" event={"ID":"7e7a5d6a-dee0-4685-8205-86f5c0d2f34d","Type":"ContainerDied","Data":"cdab3186a09cd4a22d0c79c6fd0705f8028dc82788befe2e575bd14e38ef759a"} Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.894589 4822 scope.go:117] "RemoveContainer" containerID="fe05aa7d689fc586ba9d3a1ba6983bc4ea1c4f1490f5f191a1842205052c066d" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.895273 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b94f8f99-pnppc" Oct 10 07:50:27 crc kubenswrapper[4822]: I1010 07:50:27.967422 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 10 07:50:28 crc kubenswrapper[4822]: I1010 07:50:28.144038 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:28 crc kubenswrapper[4822]: I1010 07:50:28.153334 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b94f8f99-pnppc"] Oct 10 07:50:28 crc kubenswrapper[4822]: I1010 07:50:28.650097 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:50:28 crc kubenswrapper[4822]: E1010 07:50:28.650336 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:50:28 crc kubenswrapper[4822]: I1010 07:50:28.904767 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" event={"ID":"f0c5450b-1686-49f2-abd2-22a6199e218b","Type":"ContainerStarted","Data":"a671c82969259ce74e8ff03edfed8a217d91a3ca469284a9ca759d669a88c8a7"} Oct 10 07:50:28 crc kubenswrapper[4822]: I1010 07:50:28.925835 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" podStartSLOduration=2.925813763 podStartE2EDuration="2.925813763s" podCreationTimestamp="2025-10-10 07:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 
07:50:28.922300071 +0000 UTC m=+5176.017458267" watchObservedRunningTime="2025-10-10 07:50:28.925813763 +0000 UTC m=+5176.020971969" Oct 10 07:50:29 crc kubenswrapper[4822]: I1010 07:50:29.678988 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" path="/var/lib/kubelet/pods/7e7a5d6a-dee0-4685-8205-86f5c0d2f34d/volumes" Oct 10 07:50:29 crc kubenswrapper[4822]: I1010 07:50:29.914180 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.632745 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 10 07:50:30 crc kubenswrapper[4822]: E1010 07:50:30.633179 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" containerName="init" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.633205 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" containerName="init" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.633406 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7a5d6a-dee0-4685-8205-86f5c0d2f34d" containerName="init" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.634113 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.638038 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.652475 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.721210 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.721720 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.721897 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gln77\" (UniqueName: \"kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.823638 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 
07:50:30.823956 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.824112 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gln77\" (UniqueName: \"kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.834955 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.834992 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/411c3f85fc8bb7e3cdf978b8b2948d8099b8789776a501d78e53dc90583e06bf/globalmount\"" pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.841159 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gln77\" (UniqueName: \"kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.841267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: 
\"kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.860281 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") pod \"ovn-copy-data\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") " pod="openstack/ovn-copy-data" Oct 10 07:50:30 crc kubenswrapper[4822]: I1010 07:50:30.952511 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 07:50:31 crc kubenswrapper[4822]: I1010 07:50:31.469997 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 07:50:31 crc kubenswrapper[4822]: W1010 07:50:31.478904 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a6d3351_99a2_43e0_94a7_9610fe9186eb.slice/crio-3e7ce490735967f14de67340df57fea61d6c4495e78ea53eb0431784935ce1b3 WatchSource:0}: Error finding container 3e7ce490735967f14de67340df57fea61d6c4495e78ea53eb0431784935ce1b3: Status 404 returned error can't find the container with id 3e7ce490735967f14de67340df57fea61d6c4495e78ea53eb0431784935ce1b3 Oct 10 07:50:31 crc kubenswrapper[4822]: I1010 07:50:31.930001 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6a6d3351-99a2-43e0-94a7-9610fe9186eb","Type":"ContainerStarted","Data":"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"} Oct 10 07:50:31 crc kubenswrapper[4822]: I1010 07:50:31.930367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"6a6d3351-99a2-43e0-94a7-9610fe9186eb","Type":"ContainerStarted","Data":"3e7ce490735967f14de67340df57fea61d6c4495e78ea53eb0431784935ce1b3"} Oct 10 07:50:31 crc kubenswrapper[4822]: I1010 07:50:31.950366 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.950343695 podStartE2EDuration="2.950343695s" podCreationTimestamp="2025-10-10 07:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:31.947101182 +0000 UTC m=+5179.042259398" watchObservedRunningTime="2025-10-10 07:50:31.950343695 +0000 UTC m=+5179.045501891" Oct 10 07:50:36 crc kubenswrapper[4822]: I1010 07:50:36.895244 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:50:36 crc kubenswrapper[4822]: I1010 07:50:36.994163 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"] Oct 10 07:50:36 crc kubenswrapper[4822]: I1010 07:50:36.994530 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="dnsmasq-dns" containerID="cri-o://77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070" gracePeriod=10 Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.165512 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.245:5353: connect: connection refused" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.402427 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.404864 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.406952 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.407189 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pndss" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.410723 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.427536 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.535432 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.549552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/438d0867-b9d5-4195-bff2-6f70a239db66-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.549883 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cgn\" (UniqueName: \"kubernetes.io/projected/438d0867-b9d5-4195-bff2-6f70a239db66-kube-api-access-s9cgn\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.550050 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-config\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " 
pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.550163 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-scripts\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.550340 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d0867-b9d5-4195-bff2-6f70a239db66-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.650708 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config\") pod \"6ea69b87-1caf-4ae2-9779-7695ce42f965\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.651054 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzqdl\" (UniqueName: \"kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl\") pod \"6ea69b87-1caf-4ae2-9779-7695ce42f965\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.651152 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc\") pod \"6ea69b87-1caf-4ae2-9779-7695ce42f965\" (UID: \"6ea69b87-1caf-4ae2-9779-7695ce42f965\") " Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.651745 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-config\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.651870 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-scripts\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.651987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d0867-b9d5-4195-bff2-6f70a239db66-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.652127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/438d0867-b9d5-4195-bff2-6f70a239db66-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.652203 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cgn\" (UniqueName: \"kubernetes.io/projected/438d0867-b9d5-4195-bff2-6f70a239db66-kube-api-access-s9cgn\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.652681 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-config\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.652752 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/438d0867-b9d5-4195-bff2-6f70a239db66-scripts\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.653012 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/438d0867-b9d5-4195-bff2-6f70a239db66-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.657928 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d0867-b9d5-4195-bff2-6f70a239db66-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.666068 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl" (OuterVolumeSpecName: "kube-api-access-mzqdl") pod "6ea69b87-1caf-4ae2-9779-7695ce42f965" (UID: "6ea69b87-1caf-4ae2-9779-7695ce42f965"). InnerVolumeSpecName "kube-api-access-mzqdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.667912 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cgn\" (UniqueName: \"kubernetes.io/projected/438d0867-b9d5-4195-bff2-6f70a239db66-kube-api-access-s9cgn\") pod \"ovn-northd-0\" (UID: \"438d0867-b9d5-4195-bff2-6f70a239db66\") " pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.699532 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ea69b87-1caf-4ae2-9779-7695ce42f965" (UID: "6ea69b87-1caf-4ae2-9779-7695ce42f965"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.721485 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config" (OuterVolumeSpecName: "config") pod "6ea69b87-1caf-4ae2-9779-7695ce42f965" (UID: "6ea69b87-1caf-4ae2-9779-7695ce42f965"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.736869 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.753386 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.753409 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzqdl\" (UniqueName: \"kubernetes.io/projected/6ea69b87-1caf-4ae2-9779-7695ce42f965-kube-api-access-mzqdl\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.753420 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ea69b87-1caf-4ae2-9779-7695ce42f965-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.999183 4822 generic.go:334] "Generic (PLEG): container finished" podID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerID="77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070" exitCode=0 Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.999258 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" event={"ID":"6ea69b87-1caf-4ae2-9779-7695ce42f965","Type":"ContainerDied","Data":"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"} Oct 10 07:50:37 crc kubenswrapper[4822]: I1010 07:50:37.999293 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:37.999589 4822 scope.go:117] "RemoveContainer" containerID="77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:37.999570 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-tdw2q" event={"ID":"6ea69b87-1caf-4ae2-9779-7695ce42f965","Type":"ContainerDied","Data":"8caae1610af659423fe11a9e750b3bac5b7af93e299dd50b953aafac5130e515"}
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.019331 4822 scope.go:117] "RemoveContainer" containerID="06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.036793 4822 scope.go:117] "RemoveContainer" containerID="77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"
Oct 10 07:50:38 crc kubenswrapper[4822]: E1010 07:50:38.038471 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070\": container with ID starting with 77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070 not found: ID does not exist" containerID="77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.038507 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070"} err="failed to get container status \"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070\": rpc error: code = NotFound desc = could not find container \"77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070\": container with ID starting with 77b3305fd43674b975264ae13459186a99eed7f071df96a181f912cc5e121070 not found: ID does not exist"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.038604 4822 scope.go:117] "RemoveContainer" containerID="06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886"
Oct 10 07:50:38 crc kubenswrapper[4822]: E1010 07:50:38.038872 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886\": container with ID starting with 06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886 not found: ID does not exist" containerID="06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.038895 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886"} err="failed to get container status \"06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886\": rpc error: code = NotFound desc = could not find container \"06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886\": container with ID starting with 06bd52ecab6330fbe3951fb3031f3ecef0e886b351b0a8c783dd577163256886 not found: ID does not exist"
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.043314 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"]
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.049874 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-tdw2q"]
Oct 10 07:50:38 crc kubenswrapper[4822]: I1010 07:50:38.168122 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 10 07:50:38 crc kubenswrapper[4822]: W1010 07:50:38.181605 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438d0867_b9d5_4195_bff2_6f70a239db66.slice/crio-0524c9ab1713f845fbf56c91d9b5a0a43b07b4b3e1f5698d25d202b40e49692b WatchSource:0}: Error finding container 0524c9ab1713f845fbf56c91d9b5a0a43b07b4b3e1f5698d25d202b40e49692b: Status 404 returned error can't find the container with id 0524c9ab1713f845fbf56c91d9b5a0a43b07b4b3e1f5698d25d202b40e49692b
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.013520 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"438d0867-b9d5-4195-bff2-6f70a239db66","Type":"ContainerStarted","Data":"2e735f9f2d7d7d1d82e469edfeb41e80b64e091d2aa5eb0df8525dde2d9bef61"}
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.014098 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"438d0867-b9d5-4195-bff2-6f70a239db66","Type":"ContainerStarted","Data":"dc2a4331aa511cb915576e7369c19745a5d0f4c367b81b22e6a2a60d045c4a0e"}
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.014114 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"438d0867-b9d5-4195-bff2-6f70a239db66","Type":"ContainerStarted","Data":"0524c9ab1713f845fbf56c91d9b5a0a43b07b4b3e1f5698d25d202b40e49692b"}
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.014527 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.043179 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.043151383 podStartE2EDuration="2.043151383s" podCreationTimestamp="2025-10-10 07:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:39.034509274 +0000 UTC m=+5186.129667480" watchObservedRunningTime="2025-10-10 07:50:39.043151383 +0000 UTC m=+5186.138309569"
Oct 10 07:50:39 crc kubenswrapper[4822]: I1010 07:50:39.672546 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" path="/var/lib/kubelet/pods/6ea69b87-1caf-4ae2-9779-7695ce42f965/volumes"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.685861 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l6m4k"]
Oct 10 07:50:42 crc kubenswrapper[4822]: E1010 07:50:42.686564 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="dnsmasq-dns"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.686580 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="dnsmasq-dns"
Oct 10 07:50:42 crc kubenswrapper[4822]: E1010 07:50:42.686610 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="init"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.686620 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="init"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.686790 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea69b87-1caf-4ae2-9779-7695ce42f965" containerName="dnsmasq-dns"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.687727 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.697198 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6m4k"]
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.738039 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96h25\" (UniqueName: \"kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25\") pod \"keystone-db-create-l6m4k\" (UID: \"1b8dcb89-06cd-4756-9558-6181e3c25ee3\") " pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.840614 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96h25\" (UniqueName: \"kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25\") pod \"keystone-db-create-l6m4k\" (UID: \"1b8dcb89-06cd-4756-9558-6181e3c25ee3\") " pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:42 crc kubenswrapper[4822]: I1010 07:50:42.862467 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96h25\" (UniqueName: \"kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25\") pod \"keystone-db-create-l6m4k\" (UID: \"1b8dcb89-06cd-4756-9558-6181e3c25ee3\") " pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:43 crc kubenswrapper[4822]: I1010 07:50:43.007307 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:43 crc kubenswrapper[4822]: I1010 07:50:43.475432 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l6m4k"]
Oct 10 07:50:43 crc kubenswrapper[4822]: I1010 07:50:43.663757 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:50:43 crc kubenswrapper[4822]: E1010 07:50:43.664347 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:50:44 crc kubenswrapper[4822]: I1010 07:50:44.057886 4822 generic.go:334] "Generic (PLEG): container finished" podID="1b8dcb89-06cd-4756-9558-6181e3c25ee3" containerID="61be0a6d5c7c2849ed62d86a0536b615c6874a3718b9c16c5d297be07f10fe99" exitCode=0
Oct 10 07:50:44 crc kubenswrapper[4822]: I1010 07:50:44.057956 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6m4k" event={"ID":"1b8dcb89-06cd-4756-9558-6181e3c25ee3","Type":"ContainerDied","Data":"61be0a6d5c7c2849ed62d86a0536b615c6874a3718b9c16c5d297be07f10fe99"}
Oct 10 07:50:44 crc kubenswrapper[4822]: I1010 07:50:44.058377 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6m4k" event={"ID":"1b8dcb89-06cd-4756-9558-6181e3c25ee3","Type":"ContainerStarted","Data":"d696a7b50d764f2c6522b051b2e48ce7aa1331aeee17096602531a8e65653ec1"}
Oct 10 07:50:45 crc kubenswrapper[4822]: I1010 07:50:45.516450 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:45 crc kubenswrapper[4822]: I1010 07:50:45.602926 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96h25\" (UniqueName: \"kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25\") pod \"1b8dcb89-06cd-4756-9558-6181e3c25ee3\" (UID: \"1b8dcb89-06cd-4756-9558-6181e3c25ee3\") "
Oct 10 07:50:45 crc kubenswrapper[4822]: I1010 07:50:45.609043 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25" (OuterVolumeSpecName: "kube-api-access-96h25") pod "1b8dcb89-06cd-4756-9558-6181e3c25ee3" (UID: "1b8dcb89-06cd-4756-9558-6181e3c25ee3"). InnerVolumeSpecName "kube-api-access-96h25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:50:45 crc kubenswrapper[4822]: I1010 07:50:45.705226 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96h25\" (UniqueName: \"kubernetes.io/projected/1b8dcb89-06cd-4756-9558-6181e3c25ee3-kube-api-access-96h25\") on node \"crc\" DevicePath \"\""
Oct 10 07:50:46 crc kubenswrapper[4822]: I1010 07:50:46.083774 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l6m4k" event={"ID":"1b8dcb89-06cd-4756-9558-6181e3c25ee3","Type":"ContainerDied","Data":"d696a7b50d764f2c6522b051b2e48ce7aa1331aeee17096602531a8e65653ec1"}
Oct 10 07:50:46 crc kubenswrapper[4822]: I1010 07:50:46.083882 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d696a7b50d764f2c6522b051b2e48ce7aa1331aeee17096602531a8e65653ec1"
Oct 10 07:50:46 crc kubenswrapper[4822]: I1010 07:50:46.083919 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l6m4k"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.801649 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bd67-account-create-d4j8r"]
Oct 10 07:50:52 crc kubenswrapper[4822]: E1010 07:50:52.802522 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8dcb89-06cd-4756-9558-6181e3c25ee3" containerName="mariadb-database-create"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.802536 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8dcb89-06cd-4756-9558-6181e3c25ee3" containerName="mariadb-database-create"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.802729 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8dcb89-06cd-4756-9558-6181e3c25ee3" containerName="mariadb-database-create"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.803362 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.805348 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.811230 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bd67-account-create-d4j8r"]
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.839417 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnnqw\" (UniqueName: \"kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw\") pod \"keystone-bd67-account-create-d4j8r\" (UID: \"4d2a846e-642f-4748-ae24-79441a7c078e\") " pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.848727 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.941329 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnnqw\" (UniqueName: \"kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw\") pod \"keystone-bd67-account-create-d4j8r\" (UID: \"4d2a846e-642f-4748-ae24-79441a7c078e\") " pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:52 crc kubenswrapper[4822]: I1010 07:50:52.962253 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnnqw\" (UniqueName: \"kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw\") pod \"keystone-bd67-account-create-d4j8r\" (UID: \"4d2a846e-642f-4748-ae24-79441a7c078e\") " pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:53 crc kubenswrapper[4822]: I1010 07:50:53.135294 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:53 crc kubenswrapper[4822]: I1010 07:50:53.570190 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bd67-account-create-d4j8r"]
Oct 10 07:50:53 crc kubenswrapper[4822]: W1010 07:50:53.577945 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d2a846e_642f_4748_ae24_79441a7c078e.slice/crio-35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b WatchSource:0}: Error finding container 35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b: Status 404 returned error can't find the container with id 35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b
Oct 10 07:50:54 crc kubenswrapper[4822]: I1010 07:50:54.181771 4822 generic.go:334] "Generic (PLEG): container finished" podID="4d2a846e-642f-4748-ae24-79441a7c078e" containerID="af651a343b409b0cf18c4904717d79f766a6c2b2f12e7c2eb0ac609e897e56f0" exitCode=0
Oct 10 07:50:54 crc kubenswrapper[4822]: I1010 07:50:54.182032 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd67-account-create-d4j8r" event={"ID":"4d2a846e-642f-4748-ae24-79441a7c078e","Type":"ContainerDied","Data":"af651a343b409b0cf18c4904717d79f766a6c2b2f12e7c2eb0ac609e897e56f0"}
Oct 10 07:50:54 crc kubenswrapper[4822]: I1010 07:50:54.182191 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd67-account-create-d4j8r" event={"ID":"4d2a846e-642f-4748-ae24-79441a7c078e","Type":"ContainerStarted","Data":"35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b"}
Oct 10 07:50:55 crc kubenswrapper[4822]: I1010 07:50:55.582370 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:55 crc kubenswrapper[4822]: I1010 07:50:55.686471 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnnqw\" (UniqueName: \"kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw\") pod \"4d2a846e-642f-4748-ae24-79441a7c078e\" (UID: \"4d2a846e-642f-4748-ae24-79441a7c078e\") "
Oct 10 07:50:55 crc kubenswrapper[4822]: I1010 07:50:55.694930 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw" (OuterVolumeSpecName: "kube-api-access-hnnqw") pod "4d2a846e-642f-4748-ae24-79441a7c078e" (UID: "4d2a846e-642f-4748-ae24-79441a7c078e"). InnerVolumeSpecName "kube-api-access-hnnqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:50:55 crc kubenswrapper[4822]: I1010 07:50:55.788551 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnnqw\" (UniqueName: \"kubernetes.io/projected/4d2a846e-642f-4748-ae24-79441a7c078e-kube-api-access-hnnqw\") on node \"crc\" DevicePath \"\""
Oct 10 07:50:56 crc kubenswrapper[4822]: I1010 07:50:56.207184 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd67-account-create-d4j8r" event={"ID":"4d2a846e-642f-4748-ae24-79441a7c078e","Type":"ContainerDied","Data":"35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b"}
Oct 10 07:50:56 crc kubenswrapper[4822]: I1010 07:50:56.207607 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35ada19ed58ca27a79387efef517c3b86ba5b752ec2861ee8241b5beb409df4b"
Oct 10 07:50:56 crc kubenswrapper[4822]: I1010 07:50:56.207273 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd67-account-create-d4j8r"
Oct 10 07:50:56 crc kubenswrapper[4822]: I1010 07:50:56.650686 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7"
Oct 10 07:50:56 crc kubenswrapper[4822]: E1010 07:50:56.651076 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.273789 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-z4slf"]
Oct 10 07:50:58 crc kubenswrapper[4822]: E1010 07:50:58.274319 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2a846e-642f-4748-ae24-79441a7c078e" containerName="mariadb-account-create"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.274345 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2a846e-642f-4748-ae24-79441a7c078e" containerName="mariadb-account-create"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.274614 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2a846e-642f-4748-ae24-79441a7c078e" containerName="mariadb-account-create"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.275435 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.284009 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z4slf"]
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.285424 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.285456 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.285953 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj7ct"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.297178 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.432473 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.432944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgts\" (UniqueName: \"kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.433251 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.535859 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgts\" (UniqueName: \"kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.535912 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.535971 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.542034 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.543310 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.558273 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgts\" (UniqueName: \"kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts\") pod \"keystone-db-sync-z4slf\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") " pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:58 crc kubenswrapper[4822]: I1010 07:50:58.595470 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:50:59 crc kubenswrapper[4822]: I1010 07:50:59.018983 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-z4slf"]
Oct 10 07:50:59 crc kubenswrapper[4822]: I1010 07:50:59.236451 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z4slf" event={"ID":"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f","Type":"ContainerStarted","Data":"c34c3311fbff5423d0b699c2589e505f4863546d8b8a459d2e2fbba996f779b0"}
Oct 10 07:50:59 crc kubenswrapper[4822]: I1010 07:50:59.236780 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z4slf" event={"ID":"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f","Type":"ContainerStarted","Data":"a143efcbe857f7b206052ddff4e2c8ca6eb41e5c46f09abce77323c1639a83a8"}
Oct 10 07:50:59 crc kubenswrapper[4822]: I1010 07:50:59.258244 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-z4slf" podStartSLOduration=1.258227459 podStartE2EDuration="1.258227459s" podCreationTimestamp="2025-10-10 07:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:50:59.255430568 +0000 UTC m=+5206.350588774" watchObservedRunningTime="2025-10-10 07:50:59.258227459 +0000 UTC m=+5206.353385645"
Oct 10 07:51:01 crc kubenswrapper[4822]: I1010 07:51:01.251790 4822 generic.go:334] "Generic (PLEG): container finished" podID="e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" containerID="c34c3311fbff5423d0b699c2589e505f4863546d8b8a459d2e2fbba996f779b0" exitCode=0
Oct 10 07:51:01 crc kubenswrapper[4822]: I1010 07:51:01.251867 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z4slf" event={"ID":"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f","Type":"ContainerDied","Data":"c34c3311fbff5423d0b699c2589e505f4863546d8b8a459d2e2fbba996f779b0"}
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.661395 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.806599 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data\") pod \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") "
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.807101 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle\") pod \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") "
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.807122 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zgts\" (UniqueName: \"kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts\") pod \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\" (UID: \"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f\") "
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.812431 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts" (OuterVolumeSpecName: "kube-api-access-4zgts") pod "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" (UID: "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f"). InnerVolumeSpecName "kube-api-access-4zgts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.830028 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" (UID: "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.852825 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data" (OuterVolumeSpecName: "config-data") pod "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" (UID: "e0a44420-0ca2-4319-bb9f-a7dad4d5f40f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.908979 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.909012 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zgts\" (UniqueName: \"kubernetes.io/projected/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-kube-api-access-4zgts\") on node \"crc\" DevicePath \"\""
Oct 10 07:51:02 crc kubenswrapper[4822]: I1010 07:51:02.909022 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.273513 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-z4slf" event={"ID":"e0a44420-0ca2-4319-bb9f-a7dad4d5f40f","Type":"ContainerDied","Data":"a143efcbe857f7b206052ddff4e2c8ca6eb41e5c46f09abce77323c1639a83a8"}
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.273577 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a143efcbe857f7b206052ddff4e2c8ca6eb41e5c46f09abce77323c1639a83a8"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.273660 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-z4slf"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.550947 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"]
Oct 10 07:51:03 crc kubenswrapper[4822]: E1010 07:51:03.551445 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" containerName="keystone-db-sync"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.551467 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" containerName="keystone-db-sync"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.551664 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" containerName="keystone-db-sync"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.552770 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.576770 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pg4wv"]
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.578097 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.581885 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.582047 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.582251 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj7ct"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.582417 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.604882 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"]
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.619588 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pg4wv"]
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.724367 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.724972 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725005 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725026 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9hzx\" (UniqueName: \"kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725041 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725110 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725138 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725154 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725194 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwbzm\" (UniqueName: \"kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725254 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.725282 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.826654 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.826748 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.826869 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.826916 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.826974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9hzx\" (UniqueName: \"kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827003 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd"
Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827035 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName:
\"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827073 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827132 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827168 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.827222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwbzm\" (UniqueName: \"kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.828351 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb\") pod 
\"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.831542 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.832068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.837863 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.838023 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.841452 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc 
kubenswrapper[4822]: I1010 07:51:03.849902 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.852326 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.852448 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.862917 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwbzm\" (UniqueName: \"kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm\") pod \"dnsmasq-dns-94689f579-tr7jd\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.865252 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9hzx\" (UniqueName: \"kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx\") pod \"keystone-bootstrap-pg4wv\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.880297 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:03 crc kubenswrapper[4822]: I1010 07:51:03.907880 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:04 crc kubenswrapper[4822]: I1010 07:51:04.461186 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"] Oct 10 07:51:04 crc kubenswrapper[4822]: I1010 07:51:04.469869 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pg4wv"] Oct 10 07:51:04 crc kubenswrapper[4822]: W1010 07:51:04.478046 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd881d27_b7f4_4075_b27b_85a1b46b7942.slice/crio-d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365 WatchSource:0}: Error finding container d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365: Status 404 returned error can't find the container with id d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365 Oct 10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.294550 4822 generic.go:334] "Generic (PLEG): container finished" podID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerID="fb9776a75b26d84fbe2b85493241ee160c3a16625b6a7b4d4da9ac9b3d8b8b3d" exitCode=0 Oct 10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.294615 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94689f579-tr7jd" event={"ID":"42ddc26f-23b6-4a5f-838e-5dd6cc566fce","Type":"ContainerDied","Data":"fb9776a75b26d84fbe2b85493241ee160c3a16625b6a7b4d4da9ac9b3d8b8b3d"} Oct 10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.294899 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94689f579-tr7jd" event={"ID":"42ddc26f-23b6-4a5f-838e-5dd6cc566fce","Type":"ContainerStarted","Data":"e7ad8674fa78b014d1831215415cb65eb54401e8122df6b2b4c56555e23a32d8"} Oct 
10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.297741 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pg4wv" event={"ID":"dd881d27-b7f4-4075-b27b-85a1b46b7942","Type":"ContainerStarted","Data":"27206f97264088f99cacbdf8b70d80b2a3e96d441b9b61300877741e6d97930b"} Oct 10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.297818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pg4wv" event={"ID":"dd881d27-b7f4-4075-b27b-85a1b46b7942","Type":"ContainerStarted","Data":"d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365"} Oct 10 07:51:05 crc kubenswrapper[4822]: I1010 07:51:05.339250 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pg4wv" podStartSLOduration=2.339224056 podStartE2EDuration="2.339224056s" podCreationTimestamp="2025-10-10 07:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:51:05.332655187 +0000 UTC m=+5212.427813413" watchObservedRunningTime="2025-10-10 07:51:05.339224056 +0000 UTC m=+5212.434382252" Oct 10 07:51:06 crc kubenswrapper[4822]: I1010 07:51:06.311817 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94689f579-tr7jd" event={"ID":"42ddc26f-23b6-4a5f-838e-5dd6cc566fce","Type":"ContainerStarted","Data":"154f29b6bd2e8bac3126f853023f37a59da76a70bbe4bf16e546b5fe79f434f4"} Oct 10 07:51:06 crc kubenswrapper[4822]: I1010 07:51:06.312111 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:06 crc kubenswrapper[4822]: I1010 07:51:06.334010 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94689f579-tr7jd" podStartSLOduration=3.333996786 podStartE2EDuration="3.333996786s" podCreationTimestamp="2025-10-10 07:51:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:51:06.333906294 +0000 UTC m=+5213.429064490" watchObservedRunningTime="2025-10-10 07:51:06.333996786 +0000 UTC m=+5213.429154982" Oct 10 07:51:08 crc kubenswrapper[4822]: I1010 07:51:08.330516 4822 generic.go:334] "Generic (PLEG): container finished" podID="dd881d27-b7f4-4075-b27b-85a1b46b7942" containerID="27206f97264088f99cacbdf8b70d80b2a3e96d441b9b61300877741e6d97930b" exitCode=0 Oct 10 07:51:08 crc kubenswrapper[4822]: I1010 07:51:08.330571 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pg4wv" event={"ID":"dd881d27-b7f4-4075-b27b-85a1b46b7942","Type":"ContainerDied","Data":"27206f97264088f99cacbdf8b70d80b2a3e96d441b9b61300877741e6d97930b"} Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.669878 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849688 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys\") pod \"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849840 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9hzx\" (UniqueName: \"kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx\") pod \"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849866 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys\") pod 
\"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849895 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle\") pod \"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849938 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data\") pod \"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.849967 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts\") pod \"dd881d27-b7f4-4075-b27b-85a1b46b7942\" (UID: \"dd881d27-b7f4-4075-b27b-85a1b46b7942\") " Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.856144 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts" (OuterVolumeSpecName: "scripts") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.856442 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx" (OuterVolumeSpecName: "kube-api-access-v9hzx") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "kube-api-access-v9hzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.857159 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.858768 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.881309 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.890235 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data" (OuterVolumeSpecName: "config-data") pod "dd881d27-b7f4-4075-b27b-85a1b46b7942" (UID: "dd881d27-b7f4-4075-b27b-85a1b46b7942"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.951982 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.952018 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.952028 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.952037 4822 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.952046 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9hzx\" (UniqueName: \"kubernetes.io/projected/dd881d27-b7f4-4075-b27b-85a1b46b7942-kube-api-access-v9hzx\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:09 crc kubenswrapper[4822]: I1010 07:51:09.952057 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd881d27-b7f4-4075-b27b-85a1b46b7942-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.347748 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pg4wv" event={"ID":"dd881d27-b7f4-4075-b27b-85a1b46b7942","Type":"ContainerDied","Data":"d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365"} Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 
07:51:10.348135 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5485887d2a95fff1a8da661751ac4f7bde1dcc12c2b6b30e9b506c6c469f365" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.347900 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pg4wv" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.421476 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pg4wv"] Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.428522 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pg4wv"] Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.517323 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vrz4n"] Oct 10 07:51:10 crc kubenswrapper[4822]: E1010 07:51:10.517917 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd881d27-b7f4-4075-b27b-85a1b46b7942" containerName="keystone-bootstrap" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.517943 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd881d27-b7f4-4075-b27b-85a1b46b7942" containerName="keystone-bootstrap" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.518130 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd881d27-b7f4-4075-b27b-85a1b46b7942" containerName="keystone-bootstrap" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.518664 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.521040 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.521345 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.523794 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj7ct" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.523970 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.532274 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vrz4n"] Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561032 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561074 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561112 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhgw\" (UniqueName: \"kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw\") pod \"keystone-bootstrap-vrz4n\" (UID: 
\"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561135 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561156 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.561403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.650334 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:51:10 crc kubenswrapper[4822]: E1010 07:51:10.650607 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 
07:51:10.662210 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.662247 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.662310 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhgw\" (UniqueName: \"kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.662332 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.662360 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.662434 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.666249 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.667014 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.669372 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.673500 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.675720 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys\") pod \"keystone-bootstrap-vrz4n\" (UID: 
\"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.682531 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhgw\" (UniqueName: \"kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw\") pod \"keystone-bootstrap-vrz4n\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:10 crc kubenswrapper[4822]: I1010 07:51:10.836588 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:11 crc kubenswrapper[4822]: I1010 07:51:11.305218 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vrz4n"] Oct 10 07:51:11 crc kubenswrapper[4822]: I1010 07:51:11.355376 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vrz4n" event={"ID":"4078fd0b-9003-4f1a-b877-c05a5b5752fa","Type":"ContainerStarted","Data":"363b5ae3f455bab73cd0610e4d226331cc55352f4568a845e1af021dc0d6d497"} Oct 10 07:51:11 crc kubenswrapper[4822]: I1010 07:51:11.662738 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd881d27-b7f4-4075-b27b-85a1b46b7942" path="/var/lib/kubelet/pods/dd881d27-b7f4-4075-b27b-85a1b46b7942/volumes" Oct 10 07:51:12 crc kubenswrapper[4822]: I1010 07:51:12.368366 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vrz4n" event={"ID":"4078fd0b-9003-4f1a-b877-c05a5b5752fa","Type":"ContainerStarted","Data":"a4b792516e305e692468c11066d2dc5008de1c6d34455bf127486abe1dbb99e6"} Oct 10 07:51:12 crc kubenswrapper[4822]: I1010 07:51:12.394894 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vrz4n" podStartSLOduration=2.394864422 podStartE2EDuration="2.394864422s" podCreationTimestamp="2025-10-10 07:51:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:51:12.384143783 +0000 UTC m=+5219.479302039" watchObservedRunningTime="2025-10-10 07:51:12.394864422 +0000 UTC m=+5219.490022628" Oct 10 07:51:13 crc kubenswrapper[4822]: I1010 07:51:13.883060 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:51:13 crc kubenswrapper[4822]: I1010 07:51:13.951552 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:51:13 crc kubenswrapper[4822]: I1010 07:51:13.951846 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="dnsmasq-dns" containerID="cri-o://a671c82969259ce74e8ff03edfed8a217d91a3ca469284a9ca759d669a88c8a7" gracePeriod=10 Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.396874 4822 generic.go:334] "Generic (PLEG): container finished" podID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerID="a671c82969259ce74e8ff03edfed8a217d91a3ca469284a9ca759d669a88c8a7" exitCode=0 Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.396974 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" event={"ID":"f0c5450b-1686-49f2-abd2-22a6199e218b","Type":"ContainerDied","Data":"a671c82969259ce74e8ff03edfed8a217d91a3ca469284a9ca759d669a88c8a7"} Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.397193 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" event={"ID":"f0c5450b-1686-49f2-abd2-22a6199e218b","Type":"ContainerDied","Data":"384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40"} Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.397209 4822 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="384b1f1bebe15c5ae04ce96ca78f50afd147f2f52d4407f73402ead559c4df40" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.398690 4822 generic.go:334] "Generic (PLEG): container finished" podID="4078fd0b-9003-4f1a-b877-c05a5b5752fa" containerID="a4b792516e305e692468c11066d2dc5008de1c6d34455bf127486abe1dbb99e6" exitCode=0 Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.398738 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vrz4n" event={"ID":"4078fd0b-9003-4f1a-b877-c05a5b5752fa","Type":"ContainerDied","Data":"a4b792516e305e692468c11066d2dc5008de1c6d34455bf127486abe1dbb99e6"} Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.432773 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.447715 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb\") pod \"f0c5450b-1686-49f2-abd2-22a6199e218b\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.447820 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb\") pod \"f0c5450b-1686-49f2-abd2-22a6199e218b\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.447860 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msgzh\" (UniqueName: \"kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh\") pod \"f0c5450b-1686-49f2-abd2-22a6199e218b\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.447893 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc\") pod \"f0c5450b-1686-49f2-abd2-22a6199e218b\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.447935 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config\") pod \"f0c5450b-1686-49f2-abd2-22a6199e218b\" (UID: \"f0c5450b-1686-49f2-abd2-22a6199e218b\") " Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.453521 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh" (OuterVolumeSpecName: "kube-api-access-msgzh") pod "f0c5450b-1686-49f2-abd2-22a6199e218b" (UID: "f0c5450b-1686-49f2-abd2-22a6199e218b"). InnerVolumeSpecName "kube-api-access-msgzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.491074 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0c5450b-1686-49f2-abd2-22a6199e218b" (UID: "f0c5450b-1686-49f2-abd2-22a6199e218b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.501445 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0c5450b-1686-49f2-abd2-22a6199e218b" (UID: "f0c5450b-1686-49f2-abd2-22a6199e218b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.504979 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config" (OuterVolumeSpecName: "config") pod "f0c5450b-1686-49f2-abd2-22a6199e218b" (UID: "f0c5450b-1686-49f2-abd2-22a6199e218b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.508383 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0c5450b-1686-49f2-abd2-22a6199e218b" (UID: "f0c5450b-1686-49f2-abd2-22a6199e218b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.550270 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.550304 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.550317 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msgzh\" (UniqueName: \"kubernetes.io/projected/f0c5450b-1686-49f2-abd2-22a6199e218b-kube-api-access-msgzh\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.550331 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 
07:51:14 crc kubenswrapper[4822]: I1010 07:51:14.550341 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c5450b-1686-49f2-abd2-22a6199e218b-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.409860 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bfbd688f-x4kpx" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.447761 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.452865 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bfbd688f-x4kpx"] Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.670056 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" path="/var/lib/kubelet/pods/f0c5450b-1686-49f2-abd2-22a6199e218b/volumes" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.693521 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768586 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768638 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768906 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768928 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768960 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.768993 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dhgw\" (UniqueName: 
\"kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw\") pod \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\" (UID: \"4078fd0b-9003-4f1a-b877-c05a5b5752fa\") " Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.773126 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.773176 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.773174 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw" (OuterVolumeSpecName: "kube-api-access-5dhgw") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "kube-api-access-5dhgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.784003 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts" (OuterVolumeSpecName: "scripts") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.791411 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data" (OuterVolumeSpecName: "config-data") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.791781 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4078fd0b-9003-4f1a-b877-c05a5b5752fa" (UID: "4078fd0b-9003-4f1a-b877-c05a5b5752fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.870536 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.870972 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.870992 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.871007 4822 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc 
kubenswrapper[4822]: I1010 07:51:15.871025 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dhgw\" (UniqueName: \"kubernetes.io/projected/4078fd0b-9003-4f1a-b877-c05a5b5752fa-kube-api-access-5dhgw\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:15 crc kubenswrapper[4822]: I1010 07:51:15.871041 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4078fd0b-9003-4f1a-b877-c05a5b5752fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.420038 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vrz4n" event={"ID":"4078fd0b-9003-4f1a-b877-c05a5b5752fa","Type":"ContainerDied","Data":"363b5ae3f455bab73cd0610e4d226331cc55352f4568a845e1af021dc0d6d497"} Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.420082 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="363b5ae3f455bab73cd0610e4d226331cc55352f4568a845e1af021dc0d6d497" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.420150 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vrz4n" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.587420 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fbf965899-hwsfk"] Oct 10 07:51:16 crc kubenswrapper[4822]: E1010 07:51:16.587858 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="init" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.587881 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="init" Oct 10 07:51:16 crc kubenswrapper[4822]: E1010 07:51:16.587904 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="dnsmasq-dns" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.587912 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="dnsmasq-dns" Oct 10 07:51:16 crc kubenswrapper[4822]: E1010 07:51:16.587928 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4078fd0b-9003-4f1a-b877-c05a5b5752fa" containerName="keystone-bootstrap" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.587938 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4078fd0b-9003-4f1a-b877-c05a5b5752fa" containerName="keystone-bootstrap" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.588136 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4078fd0b-9003-4f1a-b877-c05a5b5752fa" containerName="keystone-bootstrap" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.588161 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c5450b-1686-49f2-abd2-22a6199e218b" containerName="dnsmasq-dns" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.588862 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.591100 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.591397 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pj7ct" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.591679 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.598171 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fbf965899-hwsfk"] Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.600542 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.682706 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fdl\" (UniqueName: \"kubernetes.io/projected/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-kube-api-access-p6fdl\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.682930 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-combined-ca-bundle\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.683536 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-config-data\") pod 
\"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.683564 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-scripts\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.683680 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-fernet-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.683705 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-credential-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785225 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fdl\" (UniqueName: \"kubernetes.io/projected/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-kube-api-access-p6fdl\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785305 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-combined-ca-bundle\") pod \"keystone-5fbf965899-hwsfk\" (UID: 
\"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785323 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-config-data\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785339 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-scripts\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785373 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-fernet-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.785392 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-credential-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.789011 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-scripts\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 
07:51:16.789539 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-credential-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.789649 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-combined-ca-bundle\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.790471 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-config-data\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.792364 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-fernet-keys\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.802879 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fdl\" (UniqueName: \"kubernetes.io/projected/dd9d8f27-6be6-48d7-836b-a3bc2594abe3-kube-api-access-p6fdl\") pod \"keystone-5fbf965899-hwsfk\" (UID: \"dd9d8f27-6be6-48d7-836b-a3bc2594abe3\") " pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:16 crc kubenswrapper[4822]: I1010 07:51:16.912261 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:17 crc kubenswrapper[4822]: I1010 07:51:17.390894 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fbf965899-hwsfk"] Oct 10 07:51:17 crc kubenswrapper[4822]: W1010 07:51:17.396996 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd9d8f27_6be6_48d7_836b_a3bc2594abe3.slice/crio-8def9590e8224ac7ac7e9f7f260759759d096b33dee36d61d9d3f46f40aee33f WatchSource:0}: Error finding container 8def9590e8224ac7ac7e9f7f260759759d096b33dee36d61d9d3f46f40aee33f: Status 404 returned error can't find the container with id 8def9590e8224ac7ac7e9f7f260759759d096b33dee36d61d9d3f46f40aee33f Oct 10 07:51:17 crc kubenswrapper[4822]: I1010 07:51:17.427258 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fbf965899-hwsfk" event={"ID":"dd9d8f27-6be6-48d7-836b-a3bc2594abe3","Type":"ContainerStarted","Data":"8def9590e8224ac7ac7e9f7f260759759d096b33dee36d61d9d3f46f40aee33f"} Oct 10 07:51:18 crc kubenswrapper[4822]: I1010 07:51:18.436063 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fbf965899-hwsfk" event={"ID":"dd9d8f27-6be6-48d7-836b-a3bc2594abe3","Type":"ContainerStarted","Data":"5d2b41eb92e8cd6aaf2d2cea219f0aaea59c2768908d296e21875bc62f50b690"} Oct 10 07:51:18 crc kubenswrapper[4822]: I1010 07:51:18.436462 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:18 crc kubenswrapper[4822]: I1010 07:51:18.457028 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5fbf965899-hwsfk" podStartSLOduration=2.457001204 podStartE2EDuration="2.457001204s" podCreationTimestamp="2025-10-10 07:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 07:51:18.450823636 +0000 UTC m=+5225.545981862" watchObservedRunningTime="2025-10-10 07:51:18.457001204 +0000 UTC m=+5225.552159410" Oct 10 07:51:25 crc kubenswrapper[4822]: I1010 07:51:25.650314 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:51:25 crc kubenswrapper[4822]: E1010 07:51:25.650584 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:51:36 crc kubenswrapper[4822]: I1010 07:51:36.650415 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:51:36 crc kubenswrapper[4822]: E1010 07:51:36.651253 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:51:48 crc kubenswrapper[4822]: I1010 07:51:48.315319 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5fbf965899-hwsfk" Oct 10 07:51:51 crc kubenswrapper[4822]: I1010 07:51:51.650426 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:51:51 crc kubenswrapper[4822]: E1010 07:51:51.652166 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.854637 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.856173 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.859093 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.860975 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.861135 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-thzkj" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.875904 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.914877 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:52 crc kubenswrapper[4822]: E1010 07:51:52.915678 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-j4z8w openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="93f8af66-350d-4eff-a3fa-167257311f12" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.921846 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.930148 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.930281 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:52 crc kubenswrapper[4822]: I1010 07:51:52.930357 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4z8w\" (UniqueName: \"kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.032436 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.032637 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: 
I1010 07:51:53.032725 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4z8w\" (UniqueName: \"kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.034448 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: E1010 07:51:53.035674 4822 projected.go:194] Error preparing data for projected volume kube-api-access-j4z8w for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (93f8af66-350d-4eff-a3fa-167257311f12) does not match the UID in record. The object might have been deleted and then recreated Oct 10 07:51:53 crc kubenswrapper[4822]: E1010 07:51:53.035766 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w podName:93f8af66-350d-4eff-a3fa-167257311f12 nodeName:}" failed. No retries permitted until 2025-10-10 07:51:53.535742229 +0000 UTC m=+5260.630900435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j4z8w" (UniqueName: "kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w") pod "openstackclient" (UID: "93f8af66-350d-4eff-a3fa-167257311f12") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (93f8af66-350d-4eff-a3fa-167257311f12) does not match the UID in record. 
The object might have been deleted and then recreated Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.041776 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.042983 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.052177 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.054630 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.134632 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnflp\" (UniqueName: \"kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.134715 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.135039 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config\") pod \"openstackclient\" (UID: 
\"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.236916 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.237061 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnflp\" (UniqueName: \"kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.237154 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.238068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.247396 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.262229 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnflp\" (UniqueName: \"kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp\") pod \"openstackclient\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.394939 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.548845 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4z8w\" (UniqueName: \"kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w\") pod \"openstackclient\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: E1010 07:51:53.551503 4822 projected.go:194] Error preparing data for projected volume kube-api-access-j4z8w for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (93f8af66-350d-4eff-a3fa-167257311f12) does not match the UID in record. The object might have been deleted and then recreated Oct 10 07:51:53 crc kubenswrapper[4822]: E1010 07:51:53.551565 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w podName:93f8af66-350d-4eff-a3fa-167257311f12 nodeName:}" failed. No retries permitted until 2025-10-10 07:51:54.551545491 +0000 UTC m=+5261.646703687 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j4z8w" (UniqueName: "kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w") pod "openstackclient" (UID: "93f8af66-350d-4eff-a3fa-167257311f12") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (93f8af66-350d-4eff-a3fa-167257311f12) does not match the UID in record. The object might have been deleted and then recreated Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.754448 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.768460 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.772631 4822 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="93f8af66-350d-4eff-a3fa-167257311f12" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.839367 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:51:53 crc kubenswrapper[4822]: W1010 07:51:53.846722 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod102b4c01_91bf_4da7_a331_d4dad98f4d39.slice/crio-804851c4e576e0eeabf20b1f9e80ac452ac442dd010355845da757f6343eb009 WatchSource:0}: Error finding container 804851c4e576e0eeabf20b1f9e80ac452ac442dd010355845da757f6343eb009: Status 404 returned error can't find the container with id 804851c4e576e0eeabf20b1f9e80ac452ac442dd010355845da757f6343eb009 Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.857027 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config\") pod \"93f8af66-350d-4eff-a3fa-167257311f12\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.857295 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret\") pod \"93f8af66-350d-4eff-a3fa-167257311f12\" (UID: \"93f8af66-350d-4eff-a3fa-167257311f12\") " Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.857841 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "93f8af66-350d-4eff-a3fa-167257311f12" (UID: "93f8af66-350d-4eff-a3fa-167257311f12"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.858071 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.858155 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4z8w\" (UniqueName: \"kubernetes.io/projected/93f8af66-350d-4eff-a3fa-167257311f12-kube-api-access-j4z8w\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.860051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "93f8af66-350d-4eff-a3fa-167257311f12" (UID: "93f8af66-350d-4eff-a3fa-167257311f12"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:51:53 crc kubenswrapper[4822]: I1010 07:51:53.959880 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/93f8af66-350d-4eff-a3fa-167257311f12-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:51:54 crc kubenswrapper[4822]: I1010 07:51:54.770039 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"102b4c01-91bf-4da7-a331-d4dad98f4d39","Type":"ContainerStarted","Data":"4eb2c3ceac709c2141c39793b4ca227d98d49acaa877520386a9653a3d5f9e0a"} Oct 10 07:51:54 crc kubenswrapper[4822]: I1010 07:51:54.770469 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"102b4c01-91bf-4da7-a331-d4dad98f4d39","Type":"ContainerStarted","Data":"804851c4e576e0eeabf20b1f9e80ac452ac442dd010355845da757f6343eb009"} Oct 10 07:51:54 crc kubenswrapper[4822]: I1010 07:51:54.770066 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 07:51:54 crc kubenswrapper[4822]: I1010 07:51:54.797620 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.797604417 podStartE2EDuration="2.797604417s" podCreationTimestamp="2025-10-10 07:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:51:54.795827556 +0000 UTC m=+5261.890985802" watchObservedRunningTime="2025-10-10 07:51:54.797604417 +0000 UTC m=+5261.892762613" Oct 10 07:51:54 crc kubenswrapper[4822]: I1010 07:51:54.799965 4822 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="93f8af66-350d-4eff-a3fa-167257311f12" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" Oct 10 07:51:55 crc kubenswrapper[4822]: I1010 07:51:55.665569 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f8af66-350d-4eff-a3fa-167257311f12" path="/var/lib/kubelet/pods/93f8af66-350d-4eff-a3fa-167257311f12/volumes" Oct 10 07:52:03 crc kubenswrapper[4822]: I1010 07:52:03.661554 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:52:03 crc kubenswrapper[4822]: E1010 07:52:03.663476 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:52:05 crc kubenswrapper[4822]: I1010 07:52:05.973360 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 
07:52:05 crc kubenswrapper[4822]: I1010 07:52:05.975452 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:05 crc kubenswrapper[4822]: I1010 07:52:05.986529 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.098912 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.099084 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw7wl\" (UniqueName: \"kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.099127 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.201216 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7wl\" (UniqueName: \"kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " 
pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.201269 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.201363 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.202038 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.202600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.222166 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7wl\" (UniqueName: \"kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl\") pod \"community-operators-sqd5w\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " 
pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.304567 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.785514 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 07:52:06 crc kubenswrapper[4822]: I1010 07:52:06.896003 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerStarted","Data":"61acf38d733864db7bda8604199f426a079e97f07db301120ad419bb2581edb9"} Oct 10 07:52:07 crc kubenswrapper[4822]: I1010 07:52:07.908589 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerDied","Data":"cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412"} Oct 10 07:52:07 crc kubenswrapper[4822]: I1010 07:52:07.909497 4822 generic.go:334] "Generic (PLEG): container finished" podID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerID="cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412" exitCode=0 Oct 10 07:52:07 crc kubenswrapper[4822]: I1010 07:52:07.912220 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:52:08 crc kubenswrapper[4822]: I1010 07:52:08.924927 4822 generic.go:334] "Generic (PLEG): container finished" podID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerID="bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb" exitCode=0 Oct 10 07:52:08 crc kubenswrapper[4822]: I1010 07:52:08.925050 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" 
event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerDied","Data":"bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb"} Oct 10 07:52:09 crc kubenswrapper[4822]: I1010 07:52:09.937593 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerStarted","Data":"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b"} Oct 10 07:52:09 crc kubenswrapper[4822]: I1010 07:52:09.975146 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqd5w" podStartSLOduration=3.571153322 podStartE2EDuration="4.975125582s" podCreationTimestamp="2025-10-10 07:52:05 +0000 UTC" firstStartedPulling="2025-10-10 07:52:07.91138982 +0000 UTC m=+5275.006548016" lastFinishedPulling="2025-10-10 07:52:09.31536207 +0000 UTC m=+5276.410520276" observedRunningTime="2025-10-10 07:52:09.964961839 +0000 UTC m=+5277.060120075" watchObservedRunningTime="2025-10-10 07:52:09.975125582 +0000 UTC m=+5277.070283798" Oct 10 07:52:16 crc kubenswrapper[4822]: I1010 07:52:16.304818 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:16 crc kubenswrapper[4822]: I1010 07:52:16.305438 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:16 crc kubenswrapper[4822]: I1010 07:52:16.367924 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:17 crc kubenswrapper[4822]: I1010 07:52:17.051477 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:17 crc kubenswrapper[4822]: I1010 07:52:17.113998 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 07:52:18 crc kubenswrapper[4822]: I1010 07:52:18.650454 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:52:18 crc kubenswrapper[4822]: E1010 07:52:18.651047 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.025078 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqd5w" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="registry-server" containerID="cri-o://817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b" gracePeriod=2 Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.487870 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.548533 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw7wl\" (UniqueName: \"kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl\") pod \"d3124365-0581-478e-bf43-2aa7f1f434c4\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.548691 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content\") pod \"d3124365-0581-478e-bf43-2aa7f1f434c4\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.548793 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities\") pod \"d3124365-0581-478e-bf43-2aa7f1f434c4\" (UID: \"d3124365-0581-478e-bf43-2aa7f1f434c4\") " Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.550155 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities" (OuterVolumeSpecName: "utilities") pod "d3124365-0581-478e-bf43-2aa7f1f434c4" (UID: "d3124365-0581-478e-bf43-2aa7f1f434c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.554446 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl" (OuterVolumeSpecName: "kube-api-access-nw7wl") pod "d3124365-0581-478e-bf43-2aa7f1f434c4" (UID: "d3124365-0581-478e-bf43-2aa7f1f434c4"). InnerVolumeSpecName "kube-api-access-nw7wl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.615662 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3124365-0581-478e-bf43-2aa7f1f434c4" (UID: "d3124365-0581-478e-bf43-2aa7f1f434c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.650423 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw7wl\" (UniqueName: \"kubernetes.io/projected/d3124365-0581-478e-bf43-2aa7f1f434c4-kube-api-access-nw7wl\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.650517 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:19 crc kubenswrapper[4822]: I1010 07:52:19.650535 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3124365-0581-478e-bf43-2aa7f1f434c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.039520 4822 generic.go:334] "Generic (PLEG): container finished" podID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerID="817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b" exitCode=0 Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.039604 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerDied","Data":"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b"} Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.039727 4822 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqd5w" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.039941 4822 scope.go:117] "RemoveContainer" containerID="817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.039921 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqd5w" event={"ID":"d3124365-0581-478e-bf43-2aa7f1f434c4","Type":"ContainerDied","Data":"61acf38d733864db7bda8604199f426a079e97f07db301120ad419bb2581edb9"} Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.071218 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.073111 4822 scope.go:117] "RemoveContainer" containerID="bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.077479 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqd5w"] Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.107765 4822 scope.go:117] "RemoveContainer" containerID="cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.142231 4822 scope.go:117] "RemoveContainer" containerID="817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b" Oct 10 07:52:20 crc kubenswrapper[4822]: E1010 07:52:20.143453 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b\": container with ID starting with 817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b not found: ID does not exist" containerID="817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.143492 
4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b"} err="failed to get container status \"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b\": rpc error: code = NotFound desc = could not find container \"817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b\": container with ID starting with 817062e83e2ff400f7f1338c124c625b9ddf8d15baeddbde5642c0f39980e85b not found: ID does not exist" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.143539 4822 scope.go:117] "RemoveContainer" containerID="bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb" Oct 10 07:52:20 crc kubenswrapper[4822]: E1010 07:52:20.144713 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb\": container with ID starting with bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb not found: ID does not exist" containerID="bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.144744 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb"} err="failed to get container status \"bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb\": rpc error: code = NotFound desc = could not find container \"bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb\": container with ID starting with bdb9a752d288e6e9c9a9975f717747a7ba757e5f3b4c5199d57db95fa5948afb not found: ID does not exist" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.144762 4822 scope.go:117] "RemoveContainer" containerID="cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412" Oct 10 07:52:20 crc kubenswrapper[4822]: E1010 
07:52:20.144982 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412\": container with ID starting with cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412 not found: ID does not exist" containerID="cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412" Oct 10 07:52:20 crc kubenswrapper[4822]: I1010 07:52:20.145007 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412"} err="failed to get container status \"cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412\": rpc error: code = NotFound desc = could not find container \"cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412\": container with ID starting with cda30985b106fd769f12dcf2900d58ed1140c8557ab8c517a655b7a0dbc6f412 not found: ID does not exist" Oct 10 07:52:21 crc kubenswrapper[4822]: I1010 07:52:21.682643 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" path="/var/lib/kubelet/pods/d3124365-0581-478e-bf43-2aa7f1f434c4/volumes" Oct 10 07:52:31 crc kubenswrapper[4822]: I1010 07:52:31.655549 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:52:31 crc kubenswrapper[4822]: E1010 07:52:31.656493 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.978271 
4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:34 crc kubenswrapper[4822]: E1010 07:52:34.979047 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="registry-server" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.979066 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="registry-server" Oct 10 07:52:34 crc kubenswrapper[4822]: E1010 07:52:34.979087 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="extract-content" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.979094 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="extract-content" Oct 10 07:52:34 crc kubenswrapper[4822]: E1010 07:52:34.979106 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="extract-utilities" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.979115 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="extract-utilities" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.979341 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3124365-0581-478e-bf43-2aa7f1f434c4" containerName="registry-server" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.983314 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:34 crc kubenswrapper[4822]: I1010 07:52:34.996199 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.043175 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.043552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzx4\" (UniqueName: \"kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.043775 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.145646 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.145723 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8nzx4\" (UniqueName: \"kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.145825 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.146309 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.146442 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.174158 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzx4\" (UniqueName: \"kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4\") pod \"redhat-marketplace-cbm57\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.355254 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:35 crc kubenswrapper[4822]: I1010 07:52:35.793223 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:36 crc kubenswrapper[4822]: I1010 07:52:36.200928 4822 generic.go:334] "Generic (PLEG): container finished" podID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerID="5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050" exitCode=0 Oct 10 07:52:36 crc kubenswrapper[4822]: I1010 07:52:36.201002 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerDied","Data":"5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050"} Oct 10 07:52:36 crc kubenswrapper[4822]: I1010 07:52:36.201288 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerStarted","Data":"32016913c40d5180ae06e74a081ab082f8f46970b413762bc00fa49913569ffd"} Oct 10 07:52:37 crc kubenswrapper[4822]: I1010 07:52:37.215914 4822 generic.go:334] "Generic (PLEG): container finished" podID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerID="e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d" exitCode=0 Oct 10 07:52:37 crc kubenswrapper[4822]: I1010 07:52:37.216321 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerDied","Data":"e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d"} Oct 10 07:52:38 crc kubenswrapper[4822]: I1010 07:52:38.228272 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" 
event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerStarted","Data":"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933"} Oct 10 07:52:38 crc kubenswrapper[4822]: I1010 07:52:38.253742 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cbm57" podStartSLOduration=2.830746397 podStartE2EDuration="4.253715954s" podCreationTimestamp="2025-10-10 07:52:34 +0000 UTC" firstStartedPulling="2025-10-10 07:52:36.203510902 +0000 UTC m=+5303.298669108" lastFinishedPulling="2025-10-10 07:52:37.626480429 +0000 UTC m=+5304.721638665" observedRunningTime="2025-10-10 07:52:38.24840282 +0000 UTC m=+5305.343561056" watchObservedRunningTime="2025-10-10 07:52:38.253715954 +0000 UTC m=+5305.348874190" Oct 10 07:52:42 crc kubenswrapper[4822]: I1010 07:52:42.650661 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:52:42 crc kubenswrapper[4822]: E1010 07:52:42.651591 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:52:45 crc kubenswrapper[4822]: I1010 07:52:45.356605 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:45 crc kubenswrapper[4822]: I1010 07:52:45.356965 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:45 crc kubenswrapper[4822]: I1010 07:52:45.408796 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:46 crc kubenswrapper[4822]: I1010 07:52:46.364569 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:46 crc kubenswrapper[4822]: I1010 07:52:46.411494 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.342298 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cbm57" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="registry-server" containerID="cri-o://cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933" gracePeriod=2 Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.754950 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.790633 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content\") pod \"d8322950-a9bd-414d-bc10-b0f52fed9c11\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.790687 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities\") pod \"d8322950-a9bd-414d-bc10-b0f52fed9c11\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.790708 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nzx4\" (UniqueName: \"kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4\") pod 
\"d8322950-a9bd-414d-bc10-b0f52fed9c11\" (UID: \"d8322950-a9bd-414d-bc10-b0f52fed9c11\") " Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.792383 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities" (OuterVolumeSpecName: "utilities") pod "d8322950-a9bd-414d-bc10-b0f52fed9c11" (UID: "d8322950-a9bd-414d-bc10-b0f52fed9c11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.796479 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4" (OuterVolumeSpecName: "kube-api-access-8nzx4") pod "d8322950-a9bd-414d-bc10-b0f52fed9c11" (UID: "d8322950-a9bd-414d-bc10-b0f52fed9c11"). InnerVolumeSpecName "kube-api-access-8nzx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.813509 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8322950-a9bd-414d-bc10-b0f52fed9c11" (UID: "d8322950-a9bd-414d-bc10-b0f52fed9c11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.892218 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.892296 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8322950-a9bd-414d-bc10-b0f52fed9c11-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:48 crc kubenswrapper[4822]: I1010 07:52:48.892306 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nzx4\" (UniqueName: \"kubernetes.io/projected/d8322950-a9bd-414d-bc10-b0f52fed9c11-kube-api-access-8nzx4\") on node \"crc\" DevicePath \"\"" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.372611 4822 generic.go:334] "Generic (PLEG): container finished" podID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerID="cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933" exitCode=0 Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.372677 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerDied","Data":"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933"} Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.373055 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbm57" event={"ID":"d8322950-a9bd-414d-bc10-b0f52fed9c11","Type":"ContainerDied","Data":"32016913c40d5180ae06e74a081ab082f8f46970b413762bc00fa49913569ffd"} Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.373088 4822 scope.go:117] "RemoveContainer" containerID="cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 
07:52:49.372778 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbm57" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.400157 4822 scope.go:117] "RemoveContainer" containerID="e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.421050 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.429665 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbm57"] Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.451382 4822 scope.go:117] "RemoveContainer" containerID="5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.472140 4822 scope.go:117] "RemoveContainer" containerID="cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933" Oct 10 07:52:49 crc kubenswrapper[4822]: E1010 07:52:49.472588 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933\": container with ID starting with cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933 not found: ID does not exist" containerID="cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.472625 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933"} err="failed to get container status \"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933\": rpc error: code = NotFound desc = could not find container \"cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933\": container with ID starting with 
cada9f94b0f8ca6b73e377af371cb40691cb8f2616e399e729c4d3c759cc4933 not found: ID does not exist" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.472652 4822 scope.go:117] "RemoveContainer" containerID="e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d" Oct 10 07:52:49 crc kubenswrapper[4822]: E1010 07:52:49.473030 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d\": container with ID starting with e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d not found: ID does not exist" containerID="e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.473057 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d"} err="failed to get container status \"e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d\": rpc error: code = NotFound desc = could not find container \"e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d\": container with ID starting with e3755f4401a66be534eb93dfd614982ceb3b8fb75ed776cca9c2ae562551f53d not found: ID does not exist" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.473075 4822 scope.go:117] "RemoveContainer" containerID="5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050" Oct 10 07:52:49 crc kubenswrapper[4822]: E1010 07:52:49.473339 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050\": container with ID starting with 5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050 not found: ID does not exist" containerID="5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050" Oct 10 07:52:49 crc 
kubenswrapper[4822]: I1010 07:52:49.473369 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050"} err="failed to get container status \"5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050\": rpc error: code = NotFound desc = could not find container \"5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050\": container with ID starting with 5bb462a25c4890ac8969d0a5de00bef9c4bd99c19b1e2f0554f5724898fce050 not found: ID does not exist" Oct 10 07:52:49 crc kubenswrapper[4822]: I1010 07:52:49.659540 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" path="/var/lib/kubelet/pods/d8322950-a9bd-414d-bc10-b0f52fed9c11/volumes" Oct 10 07:52:57 crc kubenswrapper[4822]: I1010 07:52:57.658647 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:52:57 crc kubenswrapper[4822]: E1010 07:52:57.659914 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:53:12 crc kubenswrapper[4822]: I1010 07:53:12.650541 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:53:13 crc kubenswrapper[4822]: I1010 07:53:13.609334 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2"} Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.766785 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-s92rw"] Oct 10 07:53:35 crc kubenswrapper[4822]: E1010 07:53:35.768278 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="extract-utilities" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.768311 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="extract-utilities" Oct 10 07:53:35 crc kubenswrapper[4822]: E1010 07:53:35.768337 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="registry-server" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.768357 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="registry-server" Oct 10 07:53:35 crc kubenswrapper[4822]: E1010 07:53:35.768385 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="extract-content" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.768403 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="extract-content" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.768871 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8322950-a9bd-414d-bc10-b0f52fed9c11" containerName="registry-server" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.770233 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.780139 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s92rw"] Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.884459 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdd4b\" (UniqueName: \"kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b\") pod \"barbican-db-create-s92rw\" (UID: \"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d\") " pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:35 crc kubenswrapper[4822]: I1010 07:53:35.986696 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdd4b\" (UniqueName: \"kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b\") pod \"barbican-db-create-s92rw\" (UID: \"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d\") " pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.006771 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdd4b\" (UniqueName: \"kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b\") pod \"barbican-db-create-s92rw\" (UID: \"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d\") " pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.092326 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.483480 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s92rw"] Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.821489 4822 generic.go:334] "Generic (PLEG): container finished" podID="d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" containerID="61487c853effe2ba9a0a04584e4ca5faeea24ef7b9e0f73c32cf9a5d061f9830" exitCode=0 Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.821548 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s92rw" event={"ID":"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d","Type":"ContainerDied","Data":"61487c853effe2ba9a0a04584e4ca5faeea24ef7b9e0f73c32cf9a5d061f9830"} Oct 10 07:53:36 crc kubenswrapper[4822]: I1010 07:53:36.821582 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s92rw" event={"ID":"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d","Type":"ContainerStarted","Data":"c572863cbbe447e6a17e150a415c8adcbcb3b2bc552d1a82946acec9b5925641"} Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.126938 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.228965 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdd4b\" (UniqueName: \"kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b\") pod \"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d\" (UID: \"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d\") " Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.241629 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b" (OuterVolumeSpecName: "kube-api-access-gdd4b") pod "d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" (UID: "d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d"). InnerVolumeSpecName "kube-api-access-gdd4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.330758 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdd4b\" (UniqueName: \"kubernetes.io/projected/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d-kube-api-access-gdd4b\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.845340 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s92rw" event={"ID":"d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d","Type":"ContainerDied","Data":"c572863cbbe447e6a17e150a415c8adcbcb3b2bc552d1a82946acec9b5925641"} Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.845408 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c572863cbbe447e6a17e150a415c8adcbcb3b2bc552d1a82946acec9b5925641" Oct 10 07:53:38 crc kubenswrapper[4822]: I1010 07:53:38.845361 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s92rw" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.771313 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-54a5-account-create-prsl6"] Oct 10 07:53:45 crc kubenswrapper[4822]: E1010 07:53:45.774362 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" containerName="mariadb-database-create" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.774381 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" containerName="mariadb-database-create" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.774525 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" containerName="mariadb-database-create" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.775108 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.776996 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.808932 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-54a5-account-create-prsl6"] Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.871077 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrkr\" (UniqueName: \"kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr\") pod \"barbican-54a5-account-create-prsl6\" (UID: \"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3\") " pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.972838 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrkr\" (UniqueName: 
\"kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr\") pod \"barbican-54a5-account-create-prsl6\" (UID: \"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3\") " pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:45 crc kubenswrapper[4822]: I1010 07:53:45.994202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrkr\" (UniqueName: \"kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr\") pod \"barbican-54a5-account-create-prsl6\" (UID: \"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3\") " pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:46 crc kubenswrapper[4822]: I1010 07:53:46.103158 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:46 crc kubenswrapper[4822]: I1010 07:53:46.340783 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-54a5-account-create-prsl6"] Oct 10 07:53:46 crc kubenswrapper[4822]: I1010 07:53:46.925182 4822 generic.go:334] "Generic (PLEG): container finished" podID="1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" containerID="30a4c47e78346f33907bd0aa49a07c5527d8e4127d8bcc91d22da652ed6ce13d" exitCode=0 Oct 10 07:53:46 crc kubenswrapper[4822]: I1010 07:53:46.925666 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-54a5-account-create-prsl6" event={"ID":"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3","Type":"ContainerDied","Data":"30a4c47e78346f33907bd0aa49a07c5527d8e4127d8bcc91d22da652ed6ce13d"} Oct 10 07:53:46 crc kubenswrapper[4822]: I1010 07:53:46.925714 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-54a5-account-create-prsl6" event={"ID":"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3","Type":"ContainerStarted","Data":"4c5ccd7ef847da6365d860b7a84e9c5bbb8d8a1e3aaf26ecdb78495b4cd122e0"} Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.376101 4822 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.517276 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzrkr\" (UniqueName: \"kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr\") pod \"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3\" (UID: \"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3\") " Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.523586 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr" (OuterVolumeSpecName: "kube-api-access-jzrkr") pod "1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" (UID: "1302fbd4-38e4-4317-b2ae-ea5b2feec5e3"). InnerVolumeSpecName "kube-api-access-jzrkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.619480 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzrkr\" (UniqueName: \"kubernetes.io/projected/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3-kube-api-access-jzrkr\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.946636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-54a5-account-create-prsl6" event={"ID":"1302fbd4-38e4-4317-b2ae-ea5b2feec5e3","Type":"ContainerDied","Data":"4c5ccd7ef847da6365d860b7a84e9c5bbb8d8a1e3aaf26ecdb78495b4cd122e0"} Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.946934 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c5ccd7ef847da6365d860b7a84e9c5bbb8d8a1e3aaf26ecdb78495b4cd122e0" Oct 10 07:53:48 crc kubenswrapper[4822]: I1010 07:53:48.946768 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-54a5-account-create-prsl6" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.990068 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vhdnd"] Oct 10 07:53:50 crc kubenswrapper[4822]: E1010 07:53:50.990699 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" containerName="mariadb-account-create" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.990711 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" containerName="mariadb-account-create" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.990903 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" containerName="mariadb-account-create" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.991477 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.994188 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-llwgd" Oct 10 07:53:50 crc kubenswrapper[4822]: I1010 07:53:50.994907 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.007846 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vhdnd"] Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.165576 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cffql\" (UniqueName: \"kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 
07:53:51.165871 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.166042 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.268311 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.268450 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.268552 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cffql\" (UniqueName: \"kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.274495 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.275285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.308905 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cffql\" (UniqueName: \"kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql\") pod \"barbican-db-sync-vhdnd\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.311220 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.831188 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vhdnd"] Oct 10 07:53:51 crc kubenswrapper[4822]: I1010 07:53:51.987881 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhdnd" event={"ID":"6d93a63b-47d4-4c9f-8670-e22defaaed84","Type":"ContainerStarted","Data":"d5901f873a9fbf9074cfff53e4f2edb9b8b0c6e8550129ec379dc004cc5d840d"} Oct 10 07:53:53 crc kubenswrapper[4822]: I1010 07:53:53.001109 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhdnd" event={"ID":"6d93a63b-47d4-4c9f-8670-e22defaaed84","Type":"ContainerStarted","Data":"bd500310655ee0a22d68385992a9fbe711328b89a4c4faf84c54a65b6a45f8c2"} Oct 10 07:53:53 crc kubenswrapper[4822]: I1010 07:53:53.029770 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-vhdnd" podStartSLOduration=3.029735267 podStartE2EDuration="3.029735267s" podCreationTimestamp="2025-10-10 07:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:53:53.026100343 +0000 UTC m=+5380.121258539" watchObservedRunningTime="2025-10-10 07:53:53.029735267 +0000 UTC m=+5380.124893513" Oct 10 07:53:54 crc kubenswrapper[4822]: I1010 07:53:54.017108 4822 generic.go:334] "Generic (PLEG): container finished" podID="6d93a63b-47d4-4c9f-8670-e22defaaed84" containerID="bd500310655ee0a22d68385992a9fbe711328b89a4c4faf84c54a65b6a45f8c2" exitCode=0 Oct 10 07:53:54 crc kubenswrapper[4822]: I1010 07:53:54.017208 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhdnd" event={"ID":"6d93a63b-47d4-4c9f-8670-e22defaaed84","Type":"ContainerDied","Data":"bd500310655ee0a22d68385992a9fbe711328b89a4c4faf84c54a65b6a45f8c2"} Oct 10 07:53:55 crc 
kubenswrapper[4822]: I1010 07:53:55.371300 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.442586 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cffql\" (UniqueName: \"kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql\") pod \"6d93a63b-47d4-4c9f-8670-e22defaaed84\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.442756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data\") pod \"6d93a63b-47d4-4c9f-8670-e22defaaed84\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.442837 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle\") pod \"6d93a63b-47d4-4c9f-8670-e22defaaed84\" (UID: \"6d93a63b-47d4-4c9f-8670-e22defaaed84\") " Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.450828 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql" (OuterVolumeSpecName: "kube-api-access-cffql") pod "6d93a63b-47d4-4c9f-8670-e22defaaed84" (UID: "6d93a63b-47d4-4c9f-8670-e22defaaed84"). InnerVolumeSpecName "kube-api-access-cffql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.451164 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d93a63b-47d4-4c9f-8670-e22defaaed84" (UID: "6d93a63b-47d4-4c9f-8670-e22defaaed84"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.473957 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d93a63b-47d4-4c9f-8670-e22defaaed84" (UID: "6d93a63b-47d4-4c9f-8670-e22defaaed84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.547862 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cffql\" (UniqueName: \"kubernetes.io/projected/6d93a63b-47d4-4c9f-8670-e22defaaed84-kube-api-access-cffql\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.547927 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:55 crc kubenswrapper[4822]: I1010 07:53:55.547940 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d93a63b-47d4-4c9f-8670-e22defaaed84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.043691 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhdnd" 
event={"ID":"6d93a63b-47d4-4c9f-8670-e22defaaed84","Type":"ContainerDied","Data":"d5901f873a9fbf9074cfff53e4f2edb9b8b0c6e8550129ec379dc004cc5d840d"} Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.044073 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5901f873a9fbf9074cfff53e4f2edb9b8b0c6e8550129ec379dc004cc5d840d" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.043768 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vhdnd" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.262774 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7ff764c9-vhb8b"] Oct 10 07:53:56 crc kubenswrapper[4822]: E1010 07:53:56.263219 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d93a63b-47d4-4c9f-8670-e22defaaed84" containerName="barbican-db-sync" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.263242 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d93a63b-47d4-4c9f-8670-e22defaaed84" containerName="barbican-db-sync" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.263454 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d93a63b-47d4-4c9f-8670-e22defaaed84" containerName="barbican-db-sync" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.264518 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.267742 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.270726 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-llwgd" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.271135 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.285556 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-85d4945d7b-2j8vx"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.287647 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.290048 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.308694 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7ff764c9-vhb8b"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.326209 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85d4945d7b-2j8vx"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360725 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-combined-ca-bundle\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360775 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4wmn\" (UniqueName: \"kubernetes.io/projected/08f2317b-357b-4060-92f2-13ed8d69c226-kube-api-access-f4wmn\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360860 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08f2317b-357b-4060-92f2-13ed8d69c226-logs\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360879 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data-custom\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360901 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360923 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba15f45-5531-4b65-bbfc-55a051cda9a7-logs\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " 
pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360937 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5cl\" (UniqueName: \"kubernetes.io/projected/9ba15f45-5531-4b65-bbfc-55a051cda9a7-kube-api-access-hn5cl\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360955 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.360972 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data-custom\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.361016 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-combined-ca-bundle\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.369974 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.371378 
4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.395848 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.408902 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8db6f474-rp2qt"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.410241 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.414065 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.423228 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8db6f474-rp2qt"] Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463303 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4wmn\" (UniqueName: \"kubernetes.io/projected/08f2317b-357b-4060-92f2-13ed8d69c226-kube-api-access-f4wmn\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463357 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463392 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463410 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463432 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9npz\" (UniqueName: \"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463472 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08f2317b-357b-4060-92f2-13ed8d69c226-logs\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463491 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data-custom\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463511 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463532 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba15f45-5531-4b65-bbfc-55a051cda9a7-logs\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463549 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5cl\" (UniqueName: \"kubernetes.io/projected/9ba15f45-5531-4b65-bbfc-55a051cda9a7-kube-api-access-hn5cl\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463572 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data-custom\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463588 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463624 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463648 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-combined-ca-bundle\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.463681 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-combined-ca-bundle\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.465138 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba15f45-5531-4b65-bbfc-55a051cda9a7-logs\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.466015 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08f2317b-357b-4060-92f2-13ed8d69c226-logs\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.469647 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data-custom\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.470421 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-config-data\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.473557 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-combined-ca-bundle\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.474402 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data-custom\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.478561 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba15f45-5531-4b65-bbfc-55a051cda9a7-config-data\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.485464 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f2317b-357b-4060-92f2-13ed8d69c226-combined-ca-bundle\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.491312 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5cl\" (UniqueName: \"kubernetes.io/projected/9ba15f45-5531-4b65-bbfc-55a051cda9a7-kube-api-access-hn5cl\") pod \"barbican-keystone-listener-85d4945d7b-2j8vx\" (UID: \"9ba15f45-5531-4b65-bbfc-55a051cda9a7\") " pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.494681 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4wmn\" (UniqueName: \"kubernetes.io/projected/08f2317b-357b-4060-92f2-13ed8d69c226-kube-api-access-f4wmn\") pod \"barbican-worker-7ff764c9-vhb8b\" (UID: \"08f2317b-357b-4060-92f2-13ed8d69c226\") " pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.564945 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.564987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565016 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9npz\" 
(UniqueName: \"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565052 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565101 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-combined-ca-bundle\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565130 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a4e919a-211c-401c-a620-ad1ca22ce280-logs\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565147 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565203 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data-custom\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565229 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565245 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glbm\" (UniqueName: \"kubernetes.io/projected/2a4e919a-211c-401c-a620-ad1ca22ce280-kube-api-access-4glbm\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.565922 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.566007 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.566618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.566722 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.584985 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9npz\" (UniqueName: \"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz\") pod \"dnsmasq-dns-776fc78955-d9vz7\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.597930 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7ff764c9-vhb8b" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.624302 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.667182 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a4e919a-211c-401c-a620-ad1ca22ce280-logs\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.667590 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data-custom\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.667617 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glbm\" (UniqueName: \"kubernetes.io/projected/2a4e919a-211c-401c-a620-ad1ca22ce280-kube-api-access-4glbm\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.667659 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.667721 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-combined-ca-bundle\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " 
pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.668261 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a4e919a-211c-401c-a620-ad1ca22ce280-logs\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.671830 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data-custom\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.672341 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-combined-ca-bundle\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.674170 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4e919a-211c-401c-a620-ad1ca22ce280-config-data\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.684267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glbm\" (UniqueName: \"kubernetes.io/projected/2a4e919a-211c-401c-a620-ad1ca22ce280-kube-api-access-4glbm\") pod \"barbican-api-8db6f474-rp2qt\" (UID: \"2a4e919a-211c-401c-a620-ad1ca22ce280\") " pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.698386 
4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:53:56 crc kubenswrapper[4822]: I1010 07:53:56.729446 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:57 crc kubenswrapper[4822]: I1010 07:53:57.089441 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7ff764c9-vhb8b"] Oct 10 07:53:57 crc kubenswrapper[4822]: W1010 07:53:57.093237 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f2317b_357b_4060_92f2_13ed8d69c226.slice/crio-08d67fe5256af5a6f1af0c816df4db530468ecdc419459d5416b880589c03bc0 WatchSource:0}: Error finding container 08d67fe5256af5a6f1af0c816df4db530468ecdc419459d5416b880589c03bc0: Status 404 returned error can't find the container with id 08d67fe5256af5a6f1af0c816df4db530468ecdc419459d5416b880589c03bc0 Oct 10 07:53:57 crc kubenswrapper[4822]: W1010 07:53:57.152586 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba15f45_5531_4b65_bbfc_55a051cda9a7.slice/crio-db5511f5985d1b0e0ed42624d20a0da57739cca3ea2b3d8290ed5418f3301dfb WatchSource:0}: Error finding container db5511f5985d1b0e0ed42624d20a0da57739cca3ea2b3d8290ed5418f3301dfb: Status 404 returned error can't find the container with id db5511f5985d1b0e0ed42624d20a0da57739cca3ea2b3d8290ed5418f3301dfb Oct 10 07:53:57 crc kubenswrapper[4822]: I1010 07:53:57.159485 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85d4945d7b-2j8vx"] Oct 10 07:53:57 crc kubenswrapper[4822]: I1010 07:53:57.220318 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8db6f474-rp2qt"] Oct 10 07:53:57 crc kubenswrapper[4822]: I1010 07:53:57.237195 4822 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.059448 4822 generic.go:334] "Generic (PLEG): container finished" podID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerID="63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a" exitCode=0 Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.059519 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" event={"ID":"7a1514d1-e25d-43fb-9913-a58adb856da4","Type":"ContainerDied","Data":"63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069713 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" event={"ID":"7a1514d1-e25d-43fb-9913-a58adb856da4","Type":"ContainerStarted","Data":"f0142c68240d04bc385561e400cdacb2f32efb16e2e436245a8d336e91f259ad"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069769 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069787 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069808 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8db6f474-rp2qt" event={"ID":"2a4e919a-211c-401c-a620-ad1ca22ce280","Type":"ContainerStarted","Data":"6734f2ea64ff1ac3d7535e39e4bd6008e354e444f119cacf3d37a90217f28d27"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8db6f474-rp2qt" event={"ID":"2a4e919a-211c-401c-a620-ad1ca22ce280","Type":"ContainerStarted","Data":"e0d3437306b2c76306c42b40a5837cb164b5f00d50020468e1d6058e312498ba"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069827 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8db6f474-rp2qt" event={"ID":"2a4e919a-211c-401c-a620-ad1ca22ce280","Type":"ContainerStarted","Data":"9112099f06c576f73735b84971933aff8810c926f0b1c311e25a8159fc0154d8"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069839 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" event={"ID":"9ba15f45-5531-4b65-bbfc-55a051cda9a7","Type":"ContainerStarted","Data":"97d26227cd1d8ac4cc08438042cedd2f807201629aa3429e02a771a7761744de"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069850 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" event={"ID":"9ba15f45-5531-4b65-bbfc-55a051cda9a7","Type":"ContainerStarted","Data":"0b071479788363a370276718628ebea25458fda24fe9a5c46a419646b221e8ed"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.069859 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" event={"ID":"9ba15f45-5531-4b65-bbfc-55a051cda9a7","Type":"ContainerStarted","Data":"db5511f5985d1b0e0ed42624d20a0da57739cca3ea2b3d8290ed5418f3301dfb"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.083972 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ff764c9-vhb8b" event={"ID":"08f2317b-357b-4060-92f2-13ed8d69c226","Type":"ContainerStarted","Data":"3f2721b9ec0e2bfb3da1807ff5ccbcebf0f18f58817ed90169b23eed8f03dd46"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.084023 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ff764c9-vhb8b" event={"ID":"08f2317b-357b-4060-92f2-13ed8d69c226","Type":"ContainerStarted","Data":"afcbb9b6a4008420cf56ae25106ef9b649bce141dbde6cd868c343dc3f43648f"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.084036 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7ff764c9-vhb8b" event={"ID":"08f2317b-357b-4060-92f2-13ed8d69c226","Type":"ContainerStarted","Data":"08d67fe5256af5a6f1af0c816df4db530468ecdc419459d5416b880589c03bc0"} Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.137241 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7ff764c9-vhb8b" podStartSLOduration=2.1372174250000002 podStartE2EDuration="2.137217425s" podCreationTimestamp="2025-10-10 07:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:53:58.11347124 +0000 UTC m=+5385.208629446" watchObservedRunningTime="2025-10-10 07:53:58.137217425 +0000 UTC m=+5385.232375621" Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.149002 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-85d4945d7b-2j8vx" podStartSLOduration=2.148980554 podStartE2EDuration="2.148980554s" podCreationTimestamp="2025-10-10 07:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:53:58.145827313 +0000 UTC m=+5385.240985519" watchObservedRunningTime="2025-10-10 07:53:58.148980554 +0000 UTC m=+5385.244138750" Oct 10 07:53:58 crc kubenswrapper[4822]: I1010 07:53:58.169374 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8db6f474-rp2qt" podStartSLOduration=2.169356911 podStartE2EDuration="2.169356911s" podCreationTimestamp="2025-10-10 07:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:53:58.167540649 +0000 UTC m=+5385.262698855" watchObservedRunningTime="2025-10-10 07:53:58.169356911 +0000 UTC m=+5385.264515107" Oct 10 07:53:59 crc kubenswrapper[4822]: I1010 07:53:59.095445 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" event={"ID":"7a1514d1-e25d-43fb-9913-a58adb856da4","Type":"ContainerStarted","Data":"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34"} Oct 10 07:53:59 crc kubenswrapper[4822]: I1010 07:53:59.124476 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" podStartSLOduration=3.124447828 podStartE2EDuration="3.124447828s" podCreationTimestamp="2025-10-10 07:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:53:59.120184736 +0000 UTC m=+5386.215342972" watchObservedRunningTime="2025-10-10 07:53:59.124447828 +0000 UTC m=+5386.219606064" Oct 10 07:54:00 crc kubenswrapper[4822]: I1010 07:54:00.105888 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:54:06 crc kubenswrapper[4822]: I1010 07:54:06.701169 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:54:06 crc kubenswrapper[4822]: I1010 07:54:06.784889 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"] Oct 10 07:54:06 crc kubenswrapper[4822]: I1010 07:54:06.785148 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94689f579-tr7jd" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerName="dnsmasq-dns" containerID="cri-o://154f29b6bd2e8bac3126f853023f37a59da76a70bbe4bf16e546b5fe79f434f4" gracePeriod=10 Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.193331 4822 generic.go:334] "Generic (PLEG): container finished" podID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerID="154f29b6bd2e8bac3126f853023f37a59da76a70bbe4bf16e546b5fe79f434f4" exitCode=0 Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 
07:54:07.193631 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94689f579-tr7jd" event={"ID":"42ddc26f-23b6-4a5f-838e-5dd6cc566fce","Type":"ContainerDied","Data":"154f29b6bd2e8bac3126f853023f37a59da76a70bbe4bf16e546b5fe79f434f4"} Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.299681 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.382845 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config\") pod \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.382902 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb\") pod \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.383017 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwbzm\" (UniqueName: \"kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm\") pod \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.383146 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc\") pod \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.383191 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb\") pod \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\" (UID: \"42ddc26f-23b6-4a5f-838e-5dd6cc566fce\") " Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.399751 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm" (OuterVolumeSpecName: "kube-api-access-wwbzm") pod "42ddc26f-23b6-4a5f-838e-5dd6cc566fce" (UID: "42ddc26f-23b6-4a5f-838e-5dd6cc566fce"). InnerVolumeSpecName "kube-api-access-wwbzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.427022 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42ddc26f-23b6-4a5f-838e-5dd6cc566fce" (UID: "42ddc26f-23b6-4a5f-838e-5dd6cc566fce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.427869 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42ddc26f-23b6-4a5f-838e-5dd6cc566fce" (UID: "42ddc26f-23b6-4a5f-838e-5dd6cc566fce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.428100 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config" (OuterVolumeSpecName: "config") pod "42ddc26f-23b6-4a5f-838e-5dd6cc566fce" (UID: "42ddc26f-23b6-4a5f-838e-5dd6cc566fce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.441586 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42ddc26f-23b6-4a5f-838e-5dd6cc566fce" (UID: "42ddc26f-23b6-4a5f-838e-5dd6cc566fce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.485386 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.485434 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.485448 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.485459 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:07 crc kubenswrapper[4822]: I1010 07:54:07.485472 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwbzm\" (UniqueName: \"kubernetes.io/projected/42ddc26f-23b6-4a5f-838e-5dd6cc566fce-kube-api-access-wwbzm\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.116262 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.206673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94689f579-tr7jd" event={"ID":"42ddc26f-23b6-4a5f-838e-5dd6cc566fce","Type":"ContainerDied","Data":"e7ad8674fa78b014d1831215415cb65eb54401e8122df6b2b4c56555e23a32d8"} Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.206743 4822 scope.go:117] "RemoveContainer" containerID="154f29b6bd2e8bac3126f853023f37a59da76a70bbe4bf16e546b5fe79f434f4" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.206935 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94689f579-tr7jd" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.229491 4822 scope.go:117] "RemoveContainer" containerID="fb9776a75b26d84fbe2b85493241ee160c3a16625b6a7b4d4da9ac9b3d8b8b3d" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.240866 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"] Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.251659 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8db6f474-rp2qt" Oct 10 07:54:08 crc kubenswrapper[4822]: I1010 07:54:08.254466 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94689f579-tr7jd"] Oct 10 07:54:09 crc kubenswrapper[4822]: I1010 07:54:09.662840 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" path="/var/lib/kubelet/pods/42ddc26f-23b6-4a5f-838e-5dd6cc566fce/volumes" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.063464 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qr5vj"] Oct 10 07:54:22 crc kubenswrapper[4822]: E1010 07:54:22.064454 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" 
containerName="dnsmasq-dns" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.064475 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerName="dnsmasq-dns" Oct 10 07:54:22 crc kubenswrapper[4822]: E1010 07:54:22.064503 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerName="init" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.064509 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerName="init" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.064683 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ddc26f-23b6-4a5f-838e-5dd6cc566fce" containerName="dnsmasq-dns" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.065279 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.073305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr5vj"] Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.187100 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zph\" (UniqueName: \"kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph\") pod \"neutron-db-create-qr5vj\" (UID: \"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3\") " pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.289507 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zph\" (UniqueName: \"kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph\") pod \"neutron-db-create-qr5vj\" (UID: \"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3\") " pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.308552 
4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zph\" (UniqueName: \"kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph\") pod \"neutron-db-create-qr5vj\" (UID: \"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3\") " pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.388385 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:22 crc kubenswrapper[4822]: I1010 07:54:22.835523 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr5vj"] Oct 10 07:54:22 crc kubenswrapper[4822]: W1010 07:54:22.844745 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4d470bc_cf67_4ef1_aabe_e5b175f6b1b3.slice/crio-541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9 WatchSource:0}: Error finding container 541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9: Status 404 returned error can't find the container with id 541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9 Oct 10 07:54:23 crc kubenswrapper[4822]: I1010 07:54:23.358341 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" containerID="6457dd756c10613695cd6ba6d642b53fc5f5ed98b68e3167c9ca43fb6a36668d" exitCode=0 Oct 10 07:54:23 crc kubenswrapper[4822]: I1010 07:54:23.358461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr5vj" event={"ID":"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3","Type":"ContainerDied","Data":"6457dd756c10613695cd6ba6d642b53fc5f5ed98b68e3167c9ca43fb6a36668d"} Oct 10 07:54:23 crc kubenswrapper[4822]: I1010 07:54:23.358719 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr5vj" 
event={"ID":"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3","Type":"ContainerStarted","Data":"541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9"} Oct 10 07:54:24 crc kubenswrapper[4822]: I1010 07:54:24.826654 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:24 crc kubenswrapper[4822]: I1010 07:54:24.852649 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7zph\" (UniqueName: \"kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph\") pod \"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3\" (UID: \"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3\") " Oct 10 07:54:24 crc kubenswrapper[4822]: I1010 07:54:24.859391 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph" (OuterVolumeSpecName: "kube-api-access-v7zph") pod "f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" (UID: "f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3"). InnerVolumeSpecName "kube-api-access-v7zph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:54:24 crc kubenswrapper[4822]: I1010 07:54:24.954617 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7zph\" (UniqueName: \"kubernetes.io/projected/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3-kube-api-access-v7zph\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:25 crc kubenswrapper[4822]: I1010 07:54:25.383653 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr5vj" event={"ID":"f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3","Type":"ContainerDied","Data":"541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9"} Oct 10 07:54:25 crc kubenswrapper[4822]: I1010 07:54:25.383714 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qr5vj" Oct 10 07:54:25 crc kubenswrapper[4822]: I1010 07:54:25.383718 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="541380de698685e1cf4a8997a041742e985f64145795e722e5fb076eac3e6fe9" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.209537 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1632-account-create-kbgz7"] Oct 10 07:54:32 crc kubenswrapper[4822]: E1010 07:54:32.210726 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" containerName="mariadb-database-create" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.210750 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" containerName="mariadb-database-create" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.211133 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" containerName="mariadb-database-create" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.212031 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.219854 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1632-account-create-kbgz7"] Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.226925 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.329758 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tdkj\" (UniqueName: \"kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj\") pod \"neutron-1632-account-create-kbgz7\" (UID: \"06d688cb-da87-41e9-9a40-71b73cd5e4ec\") " pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.431692 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdkj\" (UniqueName: \"kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj\") pod \"neutron-1632-account-create-kbgz7\" (UID: \"06d688cb-da87-41e9-9a40-71b73cd5e4ec\") " pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.452909 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdkj\" (UniqueName: \"kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj\") pod \"neutron-1632-account-create-kbgz7\" (UID: \"06d688cb-da87-41e9-9a40-71b73cd5e4ec\") " pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:32 crc kubenswrapper[4822]: I1010 07:54:32.542493 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:33 crc kubenswrapper[4822]: I1010 07:54:33.094306 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1632-account-create-kbgz7"] Oct 10 07:54:33 crc kubenswrapper[4822]: I1010 07:54:33.508068 4822 generic.go:334] "Generic (PLEG): container finished" podID="06d688cb-da87-41e9-9a40-71b73cd5e4ec" containerID="a1edfca0e37befb38c753a6c7e4e1dbdcf923f092e54ecec843d06905f96196a" exitCode=0 Oct 10 07:54:33 crc kubenswrapper[4822]: I1010 07:54:33.508211 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1632-account-create-kbgz7" event={"ID":"06d688cb-da87-41e9-9a40-71b73cd5e4ec","Type":"ContainerDied","Data":"a1edfca0e37befb38c753a6c7e4e1dbdcf923f092e54ecec843d06905f96196a"} Oct 10 07:54:33 crc kubenswrapper[4822]: I1010 07:54:33.508919 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1632-account-create-kbgz7" event={"ID":"06d688cb-da87-41e9-9a40-71b73cd5e4ec","Type":"ContainerStarted","Data":"73604b806e49725bb97224f766dc21915ec5b6f4c84cd285110147352190969e"} Oct 10 07:54:34 crc kubenswrapper[4822]: I1010 07:54:34.937622 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:34 crc kubenswrapper[4822]: I1010 07:54:34.979617 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tdkj\" (UniqueName: \"kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj\") pod \"06d688cb-da87-41e9-9a40-71b73cd5e4ec\" (UID: \"06d688cb-da87-41e9-9a40-71b73cd5e4ec\") " Oct 10 07:54:34 crc kubenswrapper[4822]: I1010 07:54:34.992306 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj" (OuterVolumeSpecName: "kube-api-access-2tdkj") pod "06d688cb-da87-41e9-9a40-71b73cd5e4ec" (UID: "06d688cb-da87-41e9-9a40-71b73cd5e4ec"). InnerVolumeSpecName "kube-api-access-2tdkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:54:35 crc kubenswrapper[4822]: I1010 07:54:35.081797 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tdkj\" (UniqueName: \"kubernetes.io/projected/06d688cb-da87-41e9-9a40-71b73cd5e4ec-kube-api-access-2tdkj\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:35 crc kubenswrapper[4822]: I1010 07:54:35.534495 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1632-account-create-kbgz7" event={"ID":"06d688cb-da87-41e9-9a40-71b73cd5e4ec","Type":"ContainerDied","Data":"73604b806e49725bb97224f766dc21915ec5b6f4c84cd285110147352190969e"} Oct 10 07:54:35 crc kubenswrapper[4822]: I1010 07:54:35.534560 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73604b806e49725bb97224f766dc21915ec5b6f4c84cd285110147352190969e" Oct 10 07:54:35 crc kubenswrapper[4822]: I1010 07:54:35.534573 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1632-account-create-kbgz7" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.458663 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fklpm"] Oct 10 07:54:37 crc kubenswrapper[4822]: E1010 07:54:37.459728 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d688cb-da87-41e9-9a40-71b73cd5e4ec" containerName="mariadb-account-create" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.459755 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d688cb-da87-41e9-9a40-71b73cd5e4ec" containerName="mariadb-account-create" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.460123 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d688cb-da87-41e9-9a40-71b73cd5e4ec" containerName="mariadb-account-create" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.461127 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.463322 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gdw9m" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.463520 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.464557 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.475065 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fklpm"] Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.632840 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle\") pod \"neutron-db-sync-fklpm\" 
(UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.633147 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdw6n\" (UniqueName: \"kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.633258 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.736306 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.737555 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdw6n\" (UniqueName: \"kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.737700 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 
10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.745954 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.752144 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.760601 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdw6n\" (UniqueName: \"kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n\") pod \"neutron-db-sync-fklpm\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:37 crc kubenswrapper[4822]: I1010 07:54:37.794316 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:38 crc kubenswrapper[4822]: I1010 07:54:38.014782 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fklpm"] Oct 10 07:54:38 crc kubenswrapper[4822]: W1010 07:54:38.043921 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfced2e9_1b14_4bc0_a83f_8c3e2c610a76.slice/crio-9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a WatchSource:0}: Error finding container 9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a: Status 404 returned error can't find the container with id 9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a Oct 10 07:54:38 crc kubenswrapper[4822]: I1010 07:54:38.569184 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fklpm" event={"ID":"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76","Type":"ContainerStarted","Data":"f08abf6ab3617a1144bb4d385b1f349f2c3b2188bc2a207accc57473d3f0c887"} Oct 10 07:54:38 crc kubenswrapper[4822]: I1010 07:54:38.569426 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fklpm" event={"ID":"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76","Type":"ContainerStarted","Data":"9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a"} Oct 10 07:54:38 crc kubenswrapper[4822]: I1010 07:54:38.590423 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fklpm" podStartSLOduration=1.59040545 podStartE2EDuration="1.59040545s" podCreationTimestamp="2025-10-10 07:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:54:38.584190881 +0000 UTC m=+5425.679349117" watchObservedRunningTime="2025-10-10 07:54:38.59040545 +0000 UTC m=+5425.685563646" Oct 10 07:54:42 crc kubenswrapper[4822]: I1010 07:54:42.612173 
4822 generic.go:334] "Generic (PLEG): container finished" podID="bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" containerID="f08abf6ab3617a1144bb4d385b1f349f2c3b2188bc2a207accc57473d3f0c887" exitCode=0 Oct 10 07:54:42 crc kubenswrapper[4822]: I1010 07:54:42.612297 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fklpm" event={"ID":"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76","Type":"ContainerDied","Data":"f08abf6ab3617a1144bb4d385b1f349f2c3b2188bc2a207accc57473d3f0c887"} Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.015169 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.149206 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdw6n\" (UniqueName: \"kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n\") pod \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.149408 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config\") pod \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.149493 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle\") pod \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\" (UID: \"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76\") " Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.161371 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n" (OuterVolumeSpecName: 
"kube-api-access-rdw6n") pod "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" (UID: "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76"). InnerVolumeSpecName "kube-api-access-rdw6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.176420 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" (UID: "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.180329 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config" (OuterVolumeSpecName: "config") pod "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" (UID: "bfced2e9-1b14-4bc0-a83f-8c3e2c610a76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.251781 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.251844 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.251864 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdw6n\" (UniqueName: \"kubernetes.io/projected/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76-kube-api-access-rdw6n\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.642651 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fklpm" event={"ID":"bfced2e9-1b14-4bc0-a83f-8c3e2c610a76","Type":"ContainerDied","Data":"9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a"} Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.642714 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9499b2c20ee71bdfd5c7a2190869a65ac62cc84637c14bd2da1cc4a78c19038a" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.642734 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fklpm" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.789421 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:54:44 crc kubenswrapper[4822]: E1010 07:54:44.790107 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" containerName="neutron-db-sync" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.790129 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" containerName="neutron-db-sync" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.790680 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" containerName="neutron-db-sync" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.793043 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.823826 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.862292 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7686cd5d9-vc6db"] Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.863251 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mw4\" (UniqueName: \"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.863328 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.863441 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.863469 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.863723 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.864087 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.869779 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.870167 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gdw9m" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.870685 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.874708 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7686cd5d9-vc6db"] Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965211 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965270 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdl9\" (UniqueName: \"kubernetes.io/projected/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-kube-api-access-lvdl9\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965313 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965334 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965375 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965397 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-httpd-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965423 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-combined-ca-bundle\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965479 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.965530 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-54mw4\" (UniqueName: \"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.966924 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.966973 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.967039 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.967618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:44 crc kubenswrapper[4822]: I1010 07:54:44.983559 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mw4\" (UniqueName: 
\"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4\") pod \"dnsmasq-dns-5cf9bd4d4f-6gcq4\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.066985 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvdl9\" (UniqueName: \"kubernetes.io/projected/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-kube-api-access-lvdl9\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.067056 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.067075 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-httpd-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.067097 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-combined-ca-bundle\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.070474 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-combined-ca-bundle\") pod 
\"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.070769 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.080492 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-httpd-config\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.087513 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvdl9\" (UniqueName: \"kubernetes.io/projected/e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0-kube-api-access-lvdl9\") pod \"neutron-7686cd5d9-vc6db\" (UID: \"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0\") " pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.113358 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.181649 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.573166 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.684010 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" event={"ID":"d0bbda27-59a2-41ef-9e12-c5c3abf679e1","Type":"ContainerStarted","Data":"3bec7b2fc39ea630dfd9ad9f6a135b3b78da2f33e70d80a02fbbe99a2711c556"} Oct 10 07:54:45 crc kubenswrapper[4822]: I1010 07:54:45.791113 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7686cd5d9-vc6db"] Oct 10 07:54:45 crc kubenswrapper[4822]: W1010 07:54:45.870458 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5e9e3c2_e7e6_45a2_a6eb_7f9c9d0939d0.slice/crio-9133b45c94ac7a8a6eb139eeab04759991d8146ee7509c27e488b72cb81783a6 WatchSource:0}: Error finding container 9133b45c94ac7a8a6eb139eeab04759991d8146ee7509c27e488b72cb81783a6: Status 404 returned error can't find the container with id 9133b45c94ac7a8a6eb139eeab04759991d8146ee7509c27e488b72cb81783a6 Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.694218 4822 generic.go:334] "Generic (PLEG): container finished" podID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerID="31d9f31ce2d76dafbbfc8072cdb4f7e5fe395bc919d7cc534ce32157ca4fc3f5" exitCode=0 Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.694353 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" event={"ID":"d0bbda27-59a2-41ef-9e12-c5c3abf679e1","Type":"ContainerDied","Data":"31d9f31ce2d76dafbbfc8072cdb4f7e5fe395bc919d7cc534ce32157ca4fc3f5"} Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.697945 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7686cd5d9-vc6db" 
event={"ID":"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0","Type":"ContainerStarted","Data":"59c22ba446783cbba7b38d890cd417ee012ee20a47dc48dae9558e779d17aa3b"} Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.698059 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7686cd5d9-vc6db" event={"ID":"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0","Type":"ContainerStarted","Data":"a252473410bb76cebe494537a9f8d0f2031b9df35608e05335142704b9d1857e"} Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.698080 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7686cd5d9-vc6db" event={"ID":"e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0","Type":"ContainerStarted","Data":"9133b45c94ac7a8a6eb139eeab04759991d8146ee7509c27e488b72cb81783a6"} Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.698253 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:54:46 crc kubenswrapper[4822]: I1010 07:54:46.754526 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7686cd5d9-vc6db" podStartSLOduration=2.754503085 podStartE2EDuration="2.754503085s" podCreationTimestamp="2025-10-10 07:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:54:46.746116713 +0000 UTC m=+5433.841274909" watchObservedRunningTime="2025-10-10 07:54:46.754503085 +0000 UTC m=+5433.849661281" Oct 10 07:54:47 crc kubenswrapper[4822]: I1010 07:54:47.706411 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" event={"ID":"d0bbda27-59a2-41ef-9e12-c5c3abf679e1","Type":"ContainerStarted","Data":"ec5517c3ca83f7e4c07fa36912539aace99baa2aa8f693a68e321c15619ec0a1"} Oct 10 07:54:47 crc kubenswrapper[4822]: I1010 07:54:47.726863 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" podStartSLOduration=3.72684676 podStartE2EDuration="3.72684676s" podCreationTimestamp="2025-10-10 07:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:54:47.72649628 +0000 UTC m=+5434.821654496" watchObservedRunningTime="2025-10-10 07:54:47.72684676 +0000 UTC m=+5434.822004956" Oct 10 07:54:48 crc kubenswrapper[4822]: I1010 07:54:48.713330 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.115962 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.181550 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.181857 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="dnsmasq-dns" containerID="cri-o://f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34" gracePeriod=10 Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.666333 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.760511 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config\") pod \"7a1514d1-e25d-43fb-9913-a58adb856da4\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.760718 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc\") pod \"7a1514d1-e25d-43fb-9913-a58adb856da4\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.760761 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9npz\" (UniqueName: \"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz\") pod \"7a1514d1-e25d-43fb-9913-a58adb856da4\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.760779 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb\") pod \"7a1514d1-e25d-43fb-9913-a58adb856da4\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.760794 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb\") pod \"7a1514d1-e25d-43fb-9913-a58adb856da4\" (UID: \"7a1514d1-e25d-43fb-9913-a58adb856da4\") " Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.766554 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz" (OuterVolumeSpecName: "kube-api-access-j9npz") pod "7a1514d1-e25d-43fb-9913-a58adb856da4" (UID: "7a1514d1-e25d-43fb-9913-a58adb856da4"). InnerVolumeSpecName "kube-api-access-j9npz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.799691 4822 generic.go:334] "Generic (PLEG): container finished" podID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerID="f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34" exitCode=0 Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.800058 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" event={"ID":"7a1514d1-e25d-43fb-9913-a58adb856da4","Type":"ContainerDied","Data":"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34"} Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.800162 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" event={"ID":"7a1514d1-e25d-43fb-9913-a58adb856da4","Type":"ContainerDied","Data":"f0142c68240d04bc385561e400cdacb2f32efb16e2e436245a8d336e91f259ad"} Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.800239 4822 scope.go:117] "RemoveContainer" containerID="f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.800476 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-776fc78955-d9vz7" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.820733 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a1514d1-e25d-43fb-9913-a58adb856da4" (UID: "7a1514d1-e25d-43fb-9913-a58adb856da4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.826709 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config" (OuterVolumeSpecName: "config") pod "7a1514d1-e25d-43fb-9913-a58adb856da4" (UID: "7a1514d1-e25d-43fb-9913-a58adb856da4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.826763 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a1514d1-e25d-43fb-9913-a58adb856da4" (UID: "7a1514d1-e25d-43fb-9913-a58adb856da4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.839870 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a1514d1-e25d-43fb-9913-a58adb856da4" (UID: "7a1514d1-e25d-43fb-9913-a58adb856da4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.863113 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.863146 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9npz\" (UniqueName: \"kubernetes.io/projected/7a1514d1-e25d-43fb-9913-a58adb856da4-kube-api-access-j9npz\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.863161 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.863171 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.863179 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1514d1-e25d-43fb-9913-a58adb856da4-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.871971 4822 scope.go:117] "RemoveContainer" containerID="63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.889598 4822 scope.go:117] "RemoveContainer" containerID="f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34" Oct 10 07:54:55 crc kubenswrapper[4822]: E1010 07:54:55.890053 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34\": 
container with ID starting with f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34 not found: ID does not exist" containerID="f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.890106 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34"} err="failed to get container status \"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34\": rpc error: code = NotFound desc = could not find container \"f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34\": container with ID starting with f75633e46058350a19e835808333f09144a63fe56a95c1d99445113cfd39fd34 not found: ID does not exist" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.890142 4822 scope.go:117] "RemoveContainer" containerID="63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a" Oct 10 07:54:55 crc kubenswrapper[4822]: E1010 07:54:55.890507 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a\": container with ID starting with 63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a not found: ID does not exist" containerID="63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a" Oct 10 07:54:55 crc kubenswrapper[4822]: I1010 07:54:55.890600 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a"} err="failed to get container status \"63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a\": rpc error: code = NotFound desc = could not find container \"63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a\": container with ID starting with 
63ed0952bf9e0f211d96285692d7e1d84ddccd741f98b345356b341bd79bdd8a not found: ID does not exist" Oct 10 07:54:56 crc kubenswrapper[4822]: I1010 07:54:56.152518 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:54:56 crc kubenswrapper[4822]: I1010 07:54:56.162625 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-776fc78955-d9vz7"] Oct 10 07:54:57 crc kubenswrapper[4822]: I1010 07:54:57.669017 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" path="/var/lib/kubelet/pods/7a1514d1-e25d-43fb-9913-a58adb856da4/volumes" Oct 10 07:55:15 crc kubenswrapper[4822]: I1010 07:55:15.196443 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7686cd5d9-vc6db" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.155983 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6ztlk"] Oct 10 07:55:23 crc kubenswrapper[4822]: E1010 07:55:23.157016 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="dnsmasq-dns" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.157035 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="dnsmasq-dns" Oct 10 07:55:23 crc kubenswrapper[4822]: E1010 07:55:23.157082 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="init" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.157091 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="init" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.157315 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1514d1-e25d-43fb-9913-a58adb856da4" containerName="dnsmasq-dns" Oct 10 07:55:23 crc 
kubenswrapper[4822]: I1010 07:55:23.158097 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.163776 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6ztlk"] Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.265775 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcfdk\" (UniqueName: \"kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk\") pod \"glance-db-create-6ztlk\" (UID: \"8d1bc353-4be3-4cba-87ac-9cbee0c72e28\") " pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.366597 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcfdk\" (UniqueName: \"kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk\") pod \"glance-db-create-6ztlk\" (UID: \"8d1bc353-4be3-4cba-87ac-9cbee0c72e28\") " pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.392504 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcfdk\" (UniqueName: \"kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk\") pod \"glance-db-create-6ztlk\" (UID: \"8d1bc353-4be3-4cba-87ac-9cbee0c72e28\") " pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.418076 4822 scope.go:117] "RemoveContainer" containerID="291bc0da022ba967a47d0b2c9ad816634c6acd21bdc586913388461604e80e95" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.515680 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:23 crc kubenswrapper[4822]: I1010 07:55:23.956633 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6ztlk"] Oct 10 07:55:24 crc kubenswrapper[4822]: I1010 07:55:24.072061 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6ztlk" event={"ID":"8d1bc353-4be3-4cba-87ac-9cbee0c72e28","Type":"ContainerStarted","Data":"97caf6fdb23ab19d26d312073be9a5c7632472ba034fa06f7dbe42f7316dfb2f"} Oct 10 07:55:25 crc kubenswrapper[4822]: I1010 07:55:25.081454 4822 generic.go:334] "Generic (PLEG): container finished" podID="8d1bc353-4be3-4cba-87ac-9cbee0c72e28" containerID="6df5236d20a2db9167c94b9bddbc2a92a1d378056540d84ac3a566ac807d7021" exitCode=0 Oct 10 07:55:25 crc kubenswrapper[4822]: I1010 07:55:25.081568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6ztlk" event={"ID":"8d1bc353-4be3-4cba-87ac-9cbee0c72e28","Type":"ContainerDied","Data":"6df5236d20a2db9167c94b9bddbc2a92a1d378056540d84ac3a566ac807d7021"} Oct 10 07:55:26 crc kubenswrapper[4822]: I1010 07:55:26.423874 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:26 crc kubenswrapper[4822]: I1010 07:55:26.521557 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcfdk\" (UniqueName: \"kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk\") pod \"8d1bc353-4be3-4cba-87ac-9cbee0c72e28\" (UID: \"8d1bc353-4be3-4cba-87ac-9cbee0c72e28\") " Oct 10 07:55:26 crc kubenswrapper[4822]: I1010 07:55:26.528192 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk" (OuterVolumeSpecName: "kube-api-access-bcfdk") pod "8d1bc353-4be3-4cba-87ac-9cbee0c72e28" (UID: "8d1bc353-4be3-4cba-87ac-9cbee0c72e28"). InnerVolumeSpecName "kube-api-access-bcfdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:26 crc kubenswrapper[4822]: I1010 07:55:26.622485 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcfdk\" (UniqueName: \"kubernetes.io/projected/8d1bc353-4be3-4cba-87ac-9cbee0c72e28-kube-api-access-bcfdk\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:27 crc kubenswrapper[4822]: I1010 07:55:27.111133 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6ztlk" event={"ID":"8d1bc353-4be3-4cba-87ac-9cbee0c72e28","Type":"ContainerDied","Data":"97caf6fdb23ab19d26d312073be9a5c7632472ba034fa06f7dbe42f7316dfb2f"} Oct 10 07:55:27 crc kubenswrapper[4822]: I1010 07:55:27.111170 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97caf6fdb23ab19d26d312073be9a5c7632472ba034fa06f7dbe42f7316dfb2f" Oct 10 07:55:27 crc kubenswrapper[4822]: I1010 07:55:27.111223 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6ztlk" Oct 10 07:55:31 crc kubenswrapper[4822]: I1010 07:55:31.337392 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:55:31 crc kubenswrapper[4822]: I1010 07:55:31.339942 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.307927 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:32 crc kubenswrapper[4822]: E1010 07:55:32.308490 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1bc353-4be3-4cba-87ac-9cbee0c72e28" containerName="mariadb-database-create" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.308511 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1bc353-4be3-4cba-87ac-9cbee0c72e28" containerName="mariadb-database-create" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.308902 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1bc353-4be3-4cba-87ac-9cbee0c72e28" containerName="mariadb-database-create" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.310950 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.311089 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.432942 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.433076 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z85\" (UniqueName: \"kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.433155 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.534500 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.534920 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z85\" (UniqueName: \"kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85\") pod \"redhat-operators-cg4ng\" 
(UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.534997 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.535474 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.535541 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.562663 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z85\" (UniqueName: \"kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85\") pod \"redhat-operators-cg4ng\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:32 crc kubenswrapper[4822]: I1010 07:55:32.648734 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.108940 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.183553 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerStarted","Data":"4c28b0d47176255dc9ce57fde53ca505a90a4772ff442702148cc12ff75cba6c"} Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.231179 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2435-account-create-dhd7j"] Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.232220 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.234272 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.245217 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2435-account-create-dhd7j"] Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.351580 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8288\" (UniqueName: \"kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288\") pod \"glance-2435-account-create-dhd7j\" (UID: \"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc\") " pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.453822 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8288\" (UniqueName: \"kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288\") pod \"glance-2435-account-create-dhd7j\" (UID: 
\"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc\") " pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.472486 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8288\" (UniqueName: \"kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288\") pod \"glance-2435-account-create-dhd7j\" (UID: \"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc\") " pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:33 crc kubenswrapper[4822]: I1010 07:55:33.566194 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:34 crc kubenswrapper[4822]: I1010 07:55:34.027255 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2435-account-create-dhd7j"] Oct 10 07:55:34 crc kubenswrapper[4822]: I1010 07:55:34.194841 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-dhd7j" event={"ID":"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc","Type":"ContainerStarted","Data":"1b09bdd1f2523e5acd175953a5a703af614fb6a18fa211e43a7f1bbc72e36f68"} Oct 10 07:55:34 crc kubenswrapper[4822]: I1010 07:55:34.196936 4822 generic.go:334] "Generic (PLEG): container finished" podID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerID="506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9" exitCode=0 Oct 10 07:55:34 crc kubenswrapper[4822]: I1010 07:55:34.196976 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerDied","Data":"506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9"} Oct 10 07:55:35 crc kubenswrapper[4822]: I1010 07:55:35.204745 4822 generic.go:334] "Generic (PLEG): container finished" podID="6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" containerID="849eda3a0cf6cc49ba047cee2794ae178512815607fdd52335a08afb0af633b3" 
exitCode=0 Oct 10 07:55:35 crc kubenswrapper[4822]: I1010 07:55:35.204790 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-dhd7j" event={"ID":"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc","Type":"ContainerDied","Data":"849eda3a0cf6cc49ba047cee2794ae178512815607fdd52335a08afb0af633b3"} Oct 10 07:55:35 crc kubenswrapper[4822]: I1010 07:55:35.208342 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerStarted","Data":"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8"} Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.222731 4822 generic.go:334] "Generic (PLEG): container finished" podID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerID="5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8" exitCode=0 Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.222838 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerDied","Data":"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8"} Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.568281 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.713045 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8288\" (UniqueName: \"kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288\") pod \"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc\" (UID: \"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc\") " Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.719229 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288" (OuterVolumeSpecName: "kube-api-access-z8288") pod "6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" (UID: "6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc"). InnerVolumeSpecName "kube-api-access-z8288". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:36 crc kubenswrapper[4822]: I1010 07:55:36.814960 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8288\" (UniqueName: \"kubernetes.io/projected/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc-kube-api-access-z8288\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:37 crc kubenswrapper[4822]: I1010 07:55:37.240719 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-dhd7j" event={"ID":"6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc","Type":"ContainerDied","Data":"1b09bdd1f2523e5acd175953a5a703af614fb6a18fa211e43a7f1bbc72e36f68"} Oct 10 07:55:37 crc kubenswrapper[4822]: I1010 07:55:37.240787 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b09bdd1f2523e5acd175953a5a703af614fb6a18fa211e43a7f1bbc72e36f68" Oct 10 07:55:37 crc kubenswrapper[4822]: I1010 07:55:37.240937 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2435-account-create-dhd7j" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.261177 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerStarted","Data":"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b"} Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.292717 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cg4ng" podStartSLOduration=2.83113216 podStartE2EDuration="6.292698383s" podCreationTimestamp="2025-10-10 07:55:32 +0000 UTC" firstStartedPulling="2025-10-10 07:55:34.200085497 +0000 UTC m=+5481.295243713" lastFinishedPulling="2025-10-10 07:55:37.66165173 +0000 UTC m=+5484.756809936" observedRunningTime="2025-10-10 07:55:38.291357875 +0000 UTC m=+5485.386516081" watchObservedRunningTime="2025-10-10 07:55:38.292698383 +0000 UTC m=+5485.387856589" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.309526 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-25sj8"] Oct 10 07:55:38 crc kubenswrapper[4822]: E1010 07:55:38.309964 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" containerName="mariadb-account-create" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.309989 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" containerName="mariadb-account-create" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.310222 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" containerName="mariadb-account-create" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.311010 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.313135 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bc548" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.313329 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.322005 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-25sj8"] Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.445610 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4ng\" (UniqueName: \"kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.446068 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.446237 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.446503 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.547832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4ng\" (UniqueName: \"kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.547933 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.547971 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.548121 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.553470 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle\") pod \"glance-db-sync-25sj8\" (UID: 
\"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.556082 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.565563 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.567969 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4ng\" (UniqueName: \"kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng\") pod \"glance-db-sync-25sj8\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:38 crc kubenswrapper[4822]: I1010 07:55:38.661994 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:39 crc kubenswrapper[4822]: I1010 07:55:39.224787 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-25sj8"] Oct 10 07:55:39 crc kubenswrapper[4822]: I1010 07:55:39.271057 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-25sj8" event={"ID":"ae1da02d-0767-411e-bea7-b592b9ea37e1","Type":"ContainerStarted","Data":"25fcbec4826cca75eec95ec8918a1f6fd58171c4f5948e1987296cb4d412e618"} Oct 10 07:55:40 crc kubenswrapper[4822]: I1010 07:55:40.284391 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-25sj8" event={"ID":"ae1da02d-0767-411e-bea7-b592b9ea37e1","Type":"ContainerStarted","Data":"611c7c6d95a4c4aa00307343c9f6f6a5790a7d270b909e6a491738b09b67dd19"} Oct 10 07:55:40 crc kubenswrapper[4822]: I1010 07:55:40.306944 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-25sj8" podStartSLOduration=2.306924988 podStartE2EDuration="2.306924988s" podCreationTimestamp="2025-10-10 07:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:40.302407897 +0000 UTC m=+5487.397566113" watchObservedRunningTime="2025-10-10 07:55:40.306924988 +0000 UTC m=+5487.402083184" Oct 10 07:55:42 crc kubenswrapper[4822]: I1010 07:55:42.649105 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:42 crc kubenswrapper[4822]: I1010 07:55:42.649468 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:43 crc kubenswrapper[4822]: I1010 07:55:43.718018 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cg4ng" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" 
containerName="registry-server" probeResult="failure" output=< Oct 10 07:55:43 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 07:55:43 crc kubenswrapper[4822]: > Oct 10 07:55:45 crc kubenswrapper[4822]: I1010 07:55:45.331021 4822 generic.go:334] "Generic (PLEG): container finished" podID="ae1da02d-0767-411e-bea7-b592b9ea37e1" containerID="611c7c6d95a4c4aa00307343c9f6f6a5790a7d270b909e6a491738b09b67dd19" exitCode=0 Oct 10 07:55:45 crc kubenswrapper[4822]: I1010 07:55:45.331078 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-25sj8" event={"ID":"ae1da02d-0767-411e-bea7-b592b9ea37e1","Type":"ContainerDied","Data":"611c7c6d95a4c4aa00307343c9f6f6a5790a7d270b909e6a491738b09b67dd19"} Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.703115 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.838330 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4ng\" (UniqueName: \"kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng\") pod \"ae1da02d-0767-411e-bea7-b592b9ea37e1\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.838383 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle\") pod \"ae1da02d-0767-411e-bea7-b592b9ea37e1\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.838580 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data\") pod \"ae1da02d-0767-411e-bea7-b592b9ea37e1\" (UID: 
\"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.838602 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data\") pod \"ae1da02d-0767-411e-bea7-b592b9ea37e1\" (UID: \"ae1da02d-0767-411e-bea7-b592b9ea37e1\") " Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.843640 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng" (OuterVolumeSpecName: "kube-api-access-rb4ng") pod "ae1da02d-0767-411e-bea7-b592b9ea37e1" (UID: "ae1da02d-0767-411e-bea7-b592b9ea37e1"). InnerVolumeSpecName "kube-api-access-rb4ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.845483 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ae1da02d-0767-411e-bea7-b592b9ea37e1" (UID: "ae1da02d-0767-411e-bea7-b592b9ea37e1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.876515 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae1da02d-0767-411e-bea7-b592b9ea37e1" (UID: "ae1da02d-0767-411e-bea7-b592b9ea37e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.902733 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data" (OuterVolumeSpecName: "config-data") pod "ae1da02d-0767-411e-bea7-b592b9ea37e1" (UID: "ae1da02d-0767-411e-bea7-b592b9ea37e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.941250 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.941312 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.941335 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb4ng\" (UniqueName: \"kubernetes.io/projected/ae1da02d-0767-411e-bea7-b592b9ea37e1-kube-api-access-rb4ng\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:46 crc kubenswrapper[4822]: I1010 07:55:46.941351 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae1da02d-0767-411e-bea7-b592b9ea37e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.355709 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-25sj8" event={"ID":"ae1da02d-0767-411e-bea7-b592b9ea37e1","Type":"ContainerDied","Data":"25fcbec4826cca75eec95ec8918a1f6fd58171c4f5948e1987296cb4d412e618"} Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.356155 4822 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="25fcbec4826cca75eec95ec8918a1f6fd58171c4f5948e1987296cb4d412e618" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.355849 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-25sj8" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.672093 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:47 crc kubenswrapper[4822]: E1010 07:55:47.672540 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1da02d-0767-411e-bea7-b592b9ea37e1" containerName="glance-db-sync" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.672561 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1da02d-0767-411e-bea7-b592b9ea37e1" containerName="glance-db-sync" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.672747 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae1da02d-0767-411e-bea7-b592b9ea37e1" containerName="glance-db-sync" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.673672 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.676059 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bc548" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.676244 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.676356 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.676499 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.697261 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.772844 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.774204 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.782515 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.843863 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.845794 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.847624 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.858202 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rqz\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859294 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859338 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859406 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " 
pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859438 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859497 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.859514 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.960771 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.960849 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rqz\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz\") pod \"glance-default-external-api-0\" (UID: 
\"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.960892 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.960934 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.960979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961016 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961044 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961091 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961144 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961177 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961209 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " 
pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961260 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsm6m\" (UniqueName: \"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961287 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961313 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961373 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8fm\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" 
(UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961433 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.961458 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.962406 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.963250 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.971164 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 
07:55:47.979997 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.980833 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.981023 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.984982 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rqz\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz\") pod \"glance-default-external-api-0\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:47 crc kubenswrapper[4822]: I1010 07:55:47.999549 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.063480 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.063859 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsm6m\" (UniqueName: \"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.063894 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.063939 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8fm\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.063979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " 
pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064027 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064069 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064125 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064149 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064186 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064238 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.064279 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.066092 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.066903 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.068050 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.071699 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.072137 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.072778 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.075169 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.080063 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.080573 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.086285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8fm\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.088245 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsm6m\" (UniqueName: \"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m\") pod \"dnsmasq-dns-6f7468c8c7-zph7v\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.092864 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.102508 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.168349 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.598227 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.627563 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:55:48 crc kubenswrapper[4822]: W1010 07:55:48.635843 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e8047f_1c20_4c8c_a1a4_a7314c7ce152.slice/crio-a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7 WatchSource:0}: Error finding container a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7: Status 404 returned error can't find the container with id a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7 Oct 10 07:55:48 crc kubenswrapper[4822]: I1010 07:55:48.744156 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.051454 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.392227 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerStarted","Data":"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8"} Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.392277 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerStarted","Data":"6cf05db5b41eb220da5e1f5818ca12adbe8e141aae80f2b3f6bffa81ea192104"} Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.397580 4822 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerStarted","Data":"5f38efb0ba6eb98aaad709f93718f15ae06911517ee3c2df7ce58373f74cb5af"} Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.405180 4822 generic.go:334] "Generic (PLEG): container finished" podID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerID="dd599bc48401618093c1c2ddb48bb1c858e74067d738a73f477e0683faeb5120" exitCode=0 Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.405224 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" event={"ID":"55e8047f-1c20-4c8c-a1a4-a7314c7ce152","Type":"ContainerDied","Data":"dd599bc48401618093c1c2ddb48bb1c858e74067d738a73f477e0683faeb5120"} Oct 10 07:55:49 crc kubenswrapper[4822]: I1010 07:55:49.405250 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" event={"ID":"55e8047f-1c20-4c8c-a1a4-a7314c7ce152","Type":"ContainerStarted","Data":"a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7"} Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.415774 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerStarted","Data":"1f2abaadefa4e83daa9d045318a4a527c5ae016cf81b7a3b040279cc5d712b2d"} Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.416278 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerStarted","Data":"512d72df6a2d948d275bc0e43ce567e951b71fc929bc8d40e8ed4a19307caa2c"} Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.418640 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" 
event={"ID":"55e8047f-1c20-4c8c-a1a4-a7314c7ce152","Type":"ContainerStarted","Data":"29ace8d0d9db8a4c779fe59823d3232b89eedf7b4cf3542178e8bb889c20b669"} Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.418932 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.421449 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerStarted","Data":"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6"} Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.421563 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-log" containerID="cri-o://947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" gracePeriod=30 Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.421771 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-httpd" containerID="cri-o://a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" gracePeriod=30 Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.436310 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.436295495 podStartE2EDuration="3.436295495s" podCreationTimestamp="2025-10-10 07:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:50.431999381 +0000 UTC m=+5497.527157577" watchObservedRunningTime="2025-10-10 07:55:50.436295495 +0000 UTC m=+5497.531453691" Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.458491 
4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" podStartSLOduration=3.458474794 podStartE2EDuration="3.458474794s" podCreationTimestamp="2025-10-10 07:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:50.450205956 +0000 UTC m=+5497.545364212" watchObservedRunningTime="2025-10-10 07:55:50.458474794 +0000 UTC m=+5497.553632990" Oct 10 07:55:50 crc kubenswrapper[4822]: I1010 07:55:50.481622 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.481592111 podStartE2EDuration="3.481592111s" podCreationTimestamp="2025-10-10 07:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:50.467093543 +0000 UTC m=+5497.562251799" watchObservedRunningTime="2025-10-10 07:55:50.481592111 +0000 UTC m=+5497.576750337" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.041447 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237168 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237238 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4rqz\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237291 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237377 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237410 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237435 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237507 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle\") pod \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\" (UID: \"8b1a39a1-9b59-489e-842d-d9f07ac76fd3\") " Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237670 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.237693 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs" (OuterVolumeSpecName: "logs") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.238532 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.238559 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.242963 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph" (OuterVolumeSpecName: "ceph") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.252646 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz" (OuterVolumeSpecName: "kube-api-access-r4rqz") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "kube-api-access-r4rqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.254627 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts" (OuterVolumeSpecName: "scripts") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.268199 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.295484 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data" (OuterVolumeSpecName: "config-data") pod "8b1a39a1-9b59-489e-842d-d9f07ac76fd3" (UID: "8b1a39a1-9b59-489e-842d-d9f07ac76fd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.329659 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.345027 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.345064 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.345074 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.345083 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.345094 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4rqz\" (UniqueName: \"kubernetes.io/projected/8b1a39a1-9b59-489e-842d-d9f07ac76fd3-kube-api-access-r4rqz\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431694 4822 generic.go:334] "Generic (PLEG): container finished" podID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerID="a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" exitCode=0 Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431721 4822 generic.go:334] "Generic (PLEG): container finished" podID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerID="947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" exitCode=143 Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431776 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerDied","Data":"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6"} Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerDied","Data":"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8"} Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431860 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b1a39a1-9b59-489e-842d-d9f07ac76fd3","Type":"ContainerDied","Data":"6cf05db5b41eb220da5e1f5818ca12adbe8e141aae80f2b3f6bffa81ea192104"} Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431879 4822 scope.go:117] "RemoveContainer" 
containerID="a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.431894 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.462686 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.471875 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.474433 4822 scope.go:117] "RemoveContainer" containerID="947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.489432 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:51 crc kubenswrapper[4822]: E1010 07:55:51.490167 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-log" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.490235 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-log" Oct 10 07:55:51 crc kubenswrapper[4822]: E1010 07:55:51.490302 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-httpd" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.490358 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-httpd" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.490628 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-log" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.490718 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" containerName="glance-httpd" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.491674 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.497354 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.500156 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.517917 4822 scope.go:117] "RemoveContainer" containerID="a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" Oct 10 07:55:51 crc kubenswrapper[4822]: E1010 07:55:51.518338 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6\": container with ID starting with a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6 not found: ID does not exist" containerID="a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.518369 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6"} err="failed to get container status \"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6\": rpc error: code = NotFound desc = could not find container \"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6\": container with ID starting with a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6 not found: ID does not exist" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.518394 4822 scope.go:117] "RemoveContainer" 
containerID="947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" Oct 10 07:55:51 crc kubenswrapper[4822]: E1010 07:55:51.518699 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8\": container with ID starting with 947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8 not found: ID does not exist" containerID="947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.518725 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8"} err="failed to get container status \"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8\": rpc error: code = NotFound desc = could not find container \"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8\": container with ID starting with 947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8 not found: ID does not exist" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.518744 4822 scope.go:117] "RemoveContainer" containerID="a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.520159 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6"} err="failed to get container status \"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6\": rpc error: code = NotFound desc = could not find container \"a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6\": container with ID starting with a0a675203f257ea851cb61dab2852af25df9198835705b695462beb916fb1eb6 not found: ID does not exist" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.520214 4822 scope.go:117] 
"RemoveContainer" containerID="947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.520476 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8"} err="failed to get container status \"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8\": rpc error: code = NotFound desc = could not find container \"947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8\": container with ID starting with 947ae7cc142d55f3b97beb9a868b48ed3cc1fe130e496cb2fb5b75661b2c3cc8 not found: ID does not exist" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.652944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.652999 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.653061 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.653109 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.653761 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.653961 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.654049 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzg7l\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.663193 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1a39a1-9b59-489e-842d-d9f07ac76fd3" path="/var/lib/kubelet/pods/8b1a39a1-9b59-489e-842d-d9f07ac76fd3/volumes" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755256 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755351 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755416 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755487 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755533 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755555 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzg7l\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l\") 
pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.755593 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.756160 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.756176 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.759476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.759593 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc 
kubenswrapper[4822]: I1010 07:55:51.763709 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.767750 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.773545 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzg7l\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l\") pod \"glance-default-external-api-0\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " pod="openstack/glance-default-external-api-0" Oct 10 07:55:51 crc kubenswrapper[4822]: I1010 07:55:51.812708 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.380663 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.440858 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerStarted","Data":"e40fafa7b80907f7f22962124936f5097a44a7ec4818d1155e9e7508ab5c691e"} Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.442949 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-log" containerID="cri-o://512d72df6a2d948d275bc0e43ce567e951b71fc929bc8d40e8ed4a19307caa2c" gracePeriod=30 Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.442989 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-httpd" containerID="cri-o://1f2abaadefa4e83daa9d045318a4a527c5ae016cf81b7a3b040279cc5d712b2d" gracePeriod=30 Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.707198 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.770069 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:52 crc kubenswrapper[4822]: I1010 07:55:52.949103 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.458419 4822 generic.go:334] "Generic (PLEG): container finished" podID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" 
containerID="1f2abaadefa4e83daa9d045318a4a527c5ae016cf81b7a3b040279cc5d712b2d" exitCode=0 Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.458794 4822 generic.go:334] "Generic (PLEG): container finished" podID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerID="512d72df6a2d948d275bc0e43ce567e951b71fc929bc8d40e8ed4a19307caa2c" exitCode=143 Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.458534 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerDied","Data":"1f2abaadefa4e83daa9d045318a4a527c5ae016cf81b7a3b040279cc5d712b2d"} Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.458907 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerDied","Data":"512d72df6a2d948d275bc0e43ce567e951b71fc929bc8d40e8ed4a19307caa2c"} Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.461573 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerStarted","Data":"07c4a2944eb3eb0497406fc3f71936bc9d7dfb3bfee35c47e482b5f978d16563"} Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.601503 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789289 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789640 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789693 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t8fm\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789739 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789783 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789820 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.789843 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts\") pod \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\" (UID: \"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582\") " Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.790717 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs" (OuterVolumeSpecName: "logs") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.790861 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.795786 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts" (OuterVolumeSpecName: "scripts") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.795965 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph" (OuterVolumeSpecName: "ceph") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.796588 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm" (OuterVolumeSpecName: "kube-api-access-4t8fm") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "kube-api-access-4t8fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.829163 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.857220 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data" (OuterVolumeSpecName: "config-data") pod "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" (UID: "e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892654 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892703 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892718 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t8fm\" (UniqueName: \"kubernetes.io/projected/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-kube-api-access-4t8fm\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892733 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892746 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892757 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:53 crc kubenswrapper[4822]: I1010 07:55:53.892768 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.480174 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582","Type":"ContainerDied","Data":"5f38efb0ba6eb98aaad709f93718f15ae06911517ee3c2df7ce58373f74cb5af"} Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.480251 4822 scope.go:117] "RemoveContainer" containerID="1f2abaadefa4e83daa9d045318a4a527c5ae016cf81b7a3b040279cc5d712b2d" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.480268 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.483644 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerStarted","Data":"92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6"} Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.483797 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cg4ng" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="registry-server" containerID="cri-o://5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b" gracePeriod=2 Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.518567 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.518547303 podStartE2EDuration="3.518547303s" podCreationTimestamp="2025-10-10 07:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:54.516941307 +0000 UTC m=+5501.612099563" watchObservedRunningTime="2025-10-10 07:55:54.518547303 +0000 UTC m=+5501.613705509" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.534403 4822 scope.go:117] "RemoveContainer" containerID="512d72df6a2d948d275bc0e43ce567e951b71fc929bc8d40e8ed4a19307caa2c" 
Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.561114 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.573249 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.605745 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:54 crc kubenswrapper[4822]: E1010 07:55:54.606162 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-httpd" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.606183 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-httpd" Oct 10 07:55:54 crc kubenswrapper[4822]: E1010 07:55:54.606200 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-log" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.606208 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-log" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.606425 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-httpd" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.606445 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" containerName="glance-log" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.607585 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.610845 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.652979 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.710935 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711036 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711155 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711304 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711496 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxb2x\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.711702 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.812796 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.812860 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.812908 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.812954 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.812993 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxb2x\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.813025 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.813044 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc 
kubenswrapper[4822]: I1010 07:55:54.813507 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.813767 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.818505 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.819208 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.819434 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.820688 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.835067 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxb2x\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x\") pod \"glance-default-internal-api-0\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:55:54 crc kubenswrapper[4822]: E1010 07:55:54.842300 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53eb906a_5dcb_4aac_b70f_ab90a6d5ba70.slice/crio-conmon-5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:55:54 crc kubenswrapper[4822]: I1010 07:55:54.993670 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.041650 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.120741 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities" (OuterVolumeSpecName: "utilities") pod "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" (UID: "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.118851 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities\") pod \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.123441 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content\") pod \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.125992 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9z85\" (UniqueName: \"kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85\") pod \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\" (UID: \"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70\") " Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.127355 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.132561 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85" (OuterVolumeSpecName: "kube-api-access-p9z85") pod "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" (UID: "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70"). InnerVolumeSpecName "kube-api-access-p9z85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.229106 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9z85\" (UniqueName: \"kubernetes.io/projected/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-kube-api-access-p9z85\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.230984 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" (UID: "53eb906a-5dcb-4aac-b70f-ab90a6d5ba70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.332216 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.502229 4822 generic.go:334] "Generic (PLEG): container finished" podID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerID="5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b" exitCode=0 Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.502296 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerDied","Data":"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b"} Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.502396 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cg4ng" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.502640 4822 scope.go:117] "RemoveContainer" containerID="5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.502624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg4ng" event={"ID":"53eb906a-5dcb-4aac-b70f-ab90a6d5ba70","Type":"ContainerDied","Data":"4c28b0d47176255dc9ce57fde53ca505a90a4772ff442702148cc12ff75cba6c"} Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.555489 4822 scope.go:117] "RemoveContainer" containerID="5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.561788 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.578905 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cg4ng"] Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.594227 4822 scope.go:117] "RemoveContainer" containerID="506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.625938 4822 scope.go:117] "RemoveContainer" containerID="5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b" Oct 10 07:55:55 crc kubenswrapper[4822]: E1010 07:55:55.626429 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b\": container with ID starting with 5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b not found: ID does not exist" containerID="5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.626467 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b"} err="failed to get container status \"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b\": rpc error: code = NotFound desc = could not find container \"5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b\": container with ID starting with 5e218f0cb3e549b2fd7f4d9a2a74837989b4eb810fb51f4422fea81462653c5b not found: ID does not exist" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.626492 4822 scope.go:117] "RemoveContainer" containerID="5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8" Oct 10 07:55:55 crc kubenswrapper[4822]: E1010 07:55:55.626748 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8\": container with ID starting with 5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8 not found: ID does not exist" containerID="5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.626778 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8"} err="failed to get container status \"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8\": rpc error: code = NotFound desc = could not find container \"5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8\": container with ID starting with 5da20d1137a0960322f7b7caed495ad376bfbf8368433f262d2416c9326e98d8 not found: ID does not exist" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.626794 4822 scope.go:117] "RemoveContainer" containerID="506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9" Oct 10 07:55:55 crc kubenswrapper[4822]: E1010 
07:55:55.627170 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9\": container with ID starting with 506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9 not found: ID does not exist" containerID="506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.627202 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9"} err="failed to get container status \"506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9\": rpc error: code = NotFound desc = could not find container \"506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9\": container with ID starting with 506db7c0a627334b1ddabebcee5993738269cc20e6f29a0591ec4cdfdc1f62c9 not found: ID does not exist" Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.631226 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:55:55 crc kubenswrapper[4822]: W1010 07:55:55.637090 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84dbf6b3_4cfd_4ebf_946f_3a9dbd1fa393.slice/crio-9b2083b3d05efe7390f4fd4f9d2246537865127be7c221b46bf13f1d200b942f WatchSource:0}: Error finding container 9b2083b3d05efe7390f4fd4f9d2246537865127be7c221b46bf13f1d200b942f: Status 404 returned error can't find the container with id 9b2083b3d05efe7390f4fd4f9d2246537865127be7c221b46bf13f1d200b942f Oct 10 07:55:55 crc kubenswrapper[4822]: I1010 07:55:55.668498 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" path="/var/lib/kubelet/pods/53eb906a-5dcb-4aac-b70f-ab90a6d5ba70/volumes" Oct 10 07:55:55 crc 
kubenswrapper[4822]: I1010 07:55:55.670225 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582" path="/var/lib/kubelet/pods/e8f3e44c-604b-4d3f-8f9f-b0a0dc5ad582/volumes" Oct 10 07:55:56 crc kubenswrapper[4822]: I1010 07:55:56.519460 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerStarted","Data":"d2e164305500f7e412ebe8483345263a4d55b63d7b10052cd8fba1d54e6c4751"} Oct 10 07:55:56 crc kubenswrapper[4822]: I1010 07:55:56.519786 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerStarted","Data":"9b2083b3d05efe7390f4fd4f9d2246537865127be7c221b46bf13f1d200b942f"} Oct 10 07:55:57 crc kubenswrapper[4822]: I1010 07:55:57.536332 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerStarted","Data":"f110b0d579a08a84d345f309cd94acdd4fd3801b73f320ec5ed1e10d464a4793"} Oct 10 07:55:57 crc kubenswrapper[4822]: I1010 07:55:57.565928 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.565907304 podStartE2EDuration="3.565907304s" podCreationTimestamp="2025-10-10 07:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:55:57.562569008 +0000 UTC m=+5504.657727274" watchObservedRunningTime="2025-10-10 07:55:57.565907304 +0000 UTC m=+5504.661065500" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.105287 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.196433 4822 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.196797 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="dnsmasq-dns" containerID="cri-o://ec5517c3ca83f7e4c07fa36912539aace99baa2aa8f693a68e321c15619ec0a1" gracePeriod=10 Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.550279 4822 generic.go:334] "Generic (PLEG): container finished" podID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerID="ec5517c3ca83f7e4c07fa36912539aace99baa2aa8f693a68e321c15619ec0a1" exitCode=0 Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.550467 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" event={"ID":"d0bbda27-59a2-41ef-9e12-c5c3abf679e1","Type":"ContainerDied","Data":"ec5517c3ca83f7e4c07fa36912539aace99baa2aa8f693a68e321c15619ec0a1"} Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.711880 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.816296 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb\") pod \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.816383 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb\") pod \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.816434 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config\") pod \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.816494 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc\") pod \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.816560 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mw4\" (UniqueName: \"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4\") pod \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\" (UID: \"d0bbda27-59a2-41ef-9e12-c5c3abf679e1\") " Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.825156 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4" (OuterVolumeSpecName: "kube-api-access-54mw4") pod "d0bbda27-59a2-41ef-9e12-c5c3abf679e1" (UID: "d0bbda27-59a2-41ef-9e12-c5c3abf679e1"). InnerVolumeSpecName "kube-api-access-54mw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.861223 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config" (OuterVolumeSpecName: "config") pod "d0bbda27-59a2-41ef-9e12-c5c3abf679e1" (UID: "d0bbda27-59a2-41ef-9e12-c5c3abf679e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.877153 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0bbda27-59a2-41ef-9e12-c5c3abf679e1" (UID: "d0bbda27-59a2-41ef-9e12-c5c3abf679e1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.880954 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0bbda27-59a2-41ef-9e12-c5c3abf679e1" (UID: "d0bbda27-59a2-41ef-9e12-c5c3abf679e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.893650 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0bbda27-59a2-41ef-9e12-c5c3abf679e1" (UID: "d0bbda27-59a2-41ef-9e12-c5c3abf679e1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.918222 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.918486 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mw4\" (UniqueName: \"kubernetes.io/projected/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-kube-api-access-54mw4\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.918555 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.918623 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:58 crc kubenswrapper[4822]: I1010 07:55:58.918678 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0bbda27-59a2-41ef-9e12-c5c3abf679e1-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.565862 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" event={"ID":"d0bbda27-59a2-41ef-9e12-c5c3abf679e1","Type":"ContainerDied","Data":"3bec7b2fc39ea630dfd9ad9f6a135b3b78da2f33e70d80a02fbbe99a2711c556"} Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.566404 4822 scope.go:117] "RemoveContainer" containerID="ec5517c3ca83f7e4c07fa36912539aace99baa2aa8f693a68e321c15619ec0a1" Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.565923 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4" Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.605947 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.613666 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cf9bd4d4f-6gcq4"] Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.617925 4822 scope.go:117] "RemoveContainer" containerID="31d9f31ce2d76dafbbfc8072cdb4f7e5fe395bc919d7cc534ce32157ca4fc3f5" Oct 10 07:55:59 crc kubenswrapper[4822]: I1010 07:55:59.681631 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" path="/var/lib/kubelet/pods/d0bbda27-59a2-41ef-9e12-c5c3abf679e1/volumes" Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 07:56:01.337350 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 07:56:01.337752 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 07:56:01.813709 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 07:56:01.814295 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 
07:56:01.867317 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:56:01 crc kubenswrapper[4822]: I1010 07:56:01.883722 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:56:02 crc kubenswrapper[4822]: I1010 07:56:02.608842 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:56:02 crc kubenswrapper[4822]: I1010 07:56:02.608917 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:56:04 crc kubenswrapper[4822]: I1010 07:56:04.545753 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 07:56:04 crc kubenswrapper[4822]: I1010 07:56:04.600271 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: I1010 07:56:05.042478 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: I1010 07:56:05.042537 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: I1010 07:56:05.075536 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: I1010 07:56:05.093729 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: I1010 07:56:05.640713 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:05 crc kubenswrapper[4822]: 
I1010 07:56:05.640774 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:07 crc kubenswrapper[4822]: I1010 07:56:07.509228 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:07 crc kubenswrapper[4822]: I1010 07:56:07.512913 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.916913 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-qqv8w"] Oct 10 07:56:13 crc kubenswrapper[4822]: E1010 07:56:13.917775 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="dnsmasq-dns" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.917788 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="dnsmasq-dns" Oct 10 07:56:13 crc kubenswrapper[4822]: E1010 07:56:13.917798 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="extract-content" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.917819 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="extract-content" Oct 10 07:56:13 crc kubenswrapper[4822]: E1010 07:56:13.917841 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="extract-utilities" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.917848 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="extract-utilities" Oct 10 07:56:13 crc kubenswrapper[4822]: E1010 07:56:13.917860 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="registry-server" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.917866 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="registry-server" Oct 10 07:56:13 crc kubenswrapper[4822]: E1010 07:56:13.917881 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="init" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.917886 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="init" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.918044 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bbda27-59a2-41ef-9e12-c5c3abf679e1" containerName="dnsmasq-dns" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.918058 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="53eb906a-5dcb-4aac-b70f-ab90a6d5ba70" containerName="registry-server" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.918711 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:13 crc kubenswrapper[4822]: I1010 07:56:13.928028 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qqv8w"] Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.048918 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght4k\" (UniqueName: \"kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k\") pod \"placement-db-create-qqv8w\" (UID: \"4c56f1f0-0da4-4966-a2ce-2705737f9764\") " pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.150444 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght4k\" (UniqueName: \"kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k\") pod \"placement-db-create-qqv8w\" (UID: \"4c56f1f0-0da4-4966-a2ce-2705737f9764\") " pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.169367 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght4k\" (UniqueName: \"kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k\") pod \"placement-db-create-qqv8w\" (UID: \"4c56f1f0-0da4-4966-a2ce-2705737f9764\") " pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.244011 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.679726 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-qqv8w"] Oct 10 07:56:14 crc kubenswrapper[4822]: I1010 07:56:14.727458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qqv8w" event={"ID":"4c56f1f0-0da4-4966-a2ce-2705737f9764","Type":"ContainerStarted","Data":"9f3066846355b1b09885ee6134e010818a4d92f24ba8bcffdf9522f89cf977b8"} Oct 10 07:56:15 crc kubenswrapper[4822]: I1010 07:56:15.744044 4822 generic.go:334] "Generic (PLEG): container finished" podID="4c56f1f0-0da4-4966-a2ce-2705737f9764" containerID="93f719af86eb5f09d2e201096e3c52765c986af3cb8b7a9d681e89e7191d0f97" exitCode=0 Oct 10 07:56:15 crc kubenswrapper[4822]: I1010 07:56:15.744157 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qqv8w" event={"ID":"4c56f1f0-0da4-4966-a2ce-2705737f9764","Type":"ContainerDied","Data":"93f719af86eb5f09d2e201096e3c52765c986af3cb8b7a9d681e89e7191d0f97"} Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.113995 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.209820 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ght4k\" (UniqueName: \"kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k\") pod \"4c56f1f0-0da4-4966-a2ce-2705737f9764\" (UID: \"4c56f1f0-0da4-4966-a2ce-2705737f9764\") " Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.215918 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k" (OuterVolumeSpecName: "kube-api-access-ght4k") pod "4c56f1f0-0da4-4966-a2ce-2705737f9764" (UID: "4c56f1f0-0da4-4966-a2ce-2705737f9764"). InnerVolumeSpecName "kube-api-access-ght4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.312408 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ght4k\" (UniqueName: \"kubernetes.io/projected/4c56f1f0-0da4-4966-a2ce-2705737f9764-kube-api-access-ght4k\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.775235 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-qqv8w" event={"ID":"4c56f1f0-0da4-4966-a2ce-2705737f9764","Type":"ContainerDied","Data":"9f3066846355b1b09885ee6134e010818a4d92f24ba8bcffdf9522f89cf977b8"} Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.775285 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3066846355b1b09885ee6134e010818a4d92f24ba8bcffdf9522f89cf977b8" Oct 10 07:56:17 crc kubenswrapper[4822]: I1010 07:56:17.775312 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-qqv8w" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.936371 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a241-account-create-dzbjp"] Oct 10 07:56:23 crc kubenswrapper[4822]: E1010 07:56:23.937268 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c56f1f0-0da4-4966-a2ce-2705737f9764" containerName="mariadb-database-create" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.937283 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c56f1f0-0da4-4966-a2ce-2705737f9764" containerName="mariadb-database-create" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.937477 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c56f1f0-0da4-4966-a2ce-2705737f9764" containerName="mariadb-database-create" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.938270 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.944113 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 10 07:56:23 crc kubenswrapper[4822]: I1010 07:56:23.950276 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a241-account-create-dzbjp"] Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.042060 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrvs\" (UniqueName: \"kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs\") pod \"placement-a241-account-create-dzbjp\" (UID: \"3e6ec260-52e1-456e-aac9-19c02953a0e6\") " pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.144035 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrvs\" 
(UniqueName: \"kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs\") pod \"placement-a241-account-create-dzbjp\" (UID: \"3e6ec260-52e1-456e-aac9-19c02953a0e6\") " pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.176110 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrvs\" (UniqueName: \"kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs\") pod \"placement-a241-account-create-dzbjp\" (UID: \"3e6ec260-52e1-456e-aac9-19c02953a0e6\") " pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.254948 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.766387 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a241-account-create-dzbjp"] Oct 10 07:56:24 crc kubenswrapper[4822]: I1010 07:56:24.865144 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a241-account-create-dzbjp" event={"ID":"3e6ec260-52e1-456e-aac9-19c02953a0e6","Type":"ContainerStarted","Data":"b2d743e9c082c1794d338861aa7e086a647ea6cd0c31061d7218b437326a02a3"} Oct 10 07:56:25 crc kubenswrapper[4822]: I1010 07:56:25.880684 4822 generic.go:334] "Generic (PLEG): container finished" podID="3e6ec260-52e1-456e-aac9-19c02953a0e6" containerID="78f555350bc8a215b3b7367b72385696b789d7a0ed01dd7a9bf98e94ce2406ca" exitCode=0 Oct 10 07:56:25 crc kubenswrapper[4822]: I1010 07:56:25.880841 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a241-account-create-dzbjp" event={"ID":"3e6ec260-52e1-456e-aac9-19c02953a0e6","Type":"ContainerDied","Data":"78f555350bc8a215b3b7367b72385696b789d7a0ed01dd7a9bf98e94ce2406ca"} Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.229093 4822 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.402973 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzrvs\" (UniqueName: \"kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs\") pod \"3e6ec260-52e1-456e-aac9-19c02953a0e6\" (UID: \"3e6ec260-52e1-456e-aac9-19c02953a0e6\") " Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.411953 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs" (OuterVolumeSpecName: "kube-api-access-rzrvs") pod "3e6ec260-52e1-456e-aac9-19c02953a0e6" (UID: "3e6ec260-52e1-456e-aac9-19c02953a0e6"). InnerVolumeSpecName "kube-api-access-rzrvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.508566 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzrvs\" (UniqueName: \"kubernetes.io/projected/3e6ec260-52e1-456e-aac9-19c02953a0e6-kube-api-access-rzrvs\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.913844 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a241-account-create-dzbjp" event={"ID":"3e6ec260-52e1-456e-aac9-19c02953a0e6","Type":"ContainerDied","Data":"b2d743e9c082c1794d338861aa7e086a647ea6cd0c31061d7218b437326a02a3"} Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.913954 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d743e9c082c1794d338861aa7e086a647ea6cd0c31061d7218b437326a02a3" Oct 10 07:56:27 crc kubenswrapper[4822]: I1010 07:56:27.914164 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a241-account-create-dzbjp" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.238908 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:56:29 crc kubenswrapper[4822]: E1010 07:56:29.239423 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6ec260-52e1-456e-aac9-19c02953a0e6" containerName="mariadb-account-create" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.239442 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6ec260-52e1-456e-aac9-19c02953a0e6" containerName="mariadb-account-create" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.239690 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6ec260-52e1-456e-aac9-19c02953a0e6" containerName="mariadb-account-create" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.242139 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.260523 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.299646 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rxxnx"] Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.301365 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.304179 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.304664 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zpxrn" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.305071 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.316578 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rxxnx"] Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.354702 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.354751 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.354798 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 
07:56:29.354880 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.354901 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtft\" (UniqueName: \"kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.456715 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.456778 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.456803 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.456935 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457000 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457022 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457261 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457308 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457336 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qrtft\" (UniqueName: \"kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr47w\" (UniqueName: \"kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457719 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.457719 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.458285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.458666 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.479421 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtft\" (UniqueName: \"kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft\") pod \"dnsmasq-dns-7869c9d85c-xv8n5\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.558717 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.558780 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.558890 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.558941 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr47w\" (UniqueName: \"kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w\") pod 
\"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.558974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.559466 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.562907 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.563567 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.563897 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.578232 4822 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.583880 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr47w\" (UniqueName: \"kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w\") pod \"placement-db-sync-rxxnx\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.631335 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.921622 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rxxnx"] Oct 10 07:56:29 crc kubenswrapper[4822]: I1010 07:56:29.940626 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rxxnx" event={"ID":"3ab26663-d9d5-425f-9eee-36df23b8ce23","Type":"ContainerStarted","Data":"39759b3c26d06ca740b0513b1469252d60a215fff66715815a7ee41303ee8095"} Oct 10 07:56:30 crc kubenswrapper[4822]: I1010 07:56:30.040517 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:56:30 crc kubenswrapper[4822]: W1010 07:56:30.062278 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e91592_f700_4c18_b07f_0abe2e262fd9.slice/crio-1742a6f5c62da5729998476e833b5dd09aa0d32ef265731447bf969685ba5fac WatchSource:0}: Error finding container 1742a6f5c62da5729998476e833b5dd09aa0d32ef265731447bf969685ba5fac: Status 404 returned error can't find the container with id 1742a6f5c62da5729998476e833b5dd09aa0d32ef265731447bf969685ba5fac Oct 10 07:56:30 crc kubenswrapper[4822]: I1010 07:56:30.953966 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerID="8146239dbcce20485e7eeb7d6ff104665aaf9db1e6934d244427e97b34b9e83c" exitCode=0 Oct 10 07:56:30 crc kubenswrapper[4822]: I1010 07:56:30.954048 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" event={"ID":"18e91592-f700-4c18-b07f-0abe2e262fd9","Type":"ContainerDied","Data":"8146239dbcce20485e7eeb7d6ff104665aaf9db1e6934d244427e97b34b9e83c"} Oct 10 07:56:30 crc kubenswrapper[4822]: I1010 07:56:30.954106 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" event={"ID":"18e91592-f700-4c18-b07f-0abe2e262fd9","Type":"ContainerStarted","Data":"1742a6f5c62da5729998476e833b5dd09aa0d32ef265731447bf969685ba5fac"} Oct 10 07:56:30 crc kubenswrapper[4822]: I1010 07:56:30.957764 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rxxnx" event={"ID":"3ab26663-d9d5-425f-9eee-36df23b8ce23","Type":"ContainerStarted","Data":"ba815ef758455e238fed76b31d1a6d4aaef1ab82a0635e4498de556f5a0ec6ed"} Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.022035 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rxxnx" podStartSLOduration=2.021986732 podStartE2EDuration="2.021986732s" podCreationTimestamp="2025-10-10 07:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:56:31.009958685 +0000 UTC m=+5538.105116921" watchObservedRunningTime="2025-10-10 07:56:31.021986732 +0000 UTC m=+5538.117144948" Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.336517 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 
07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.336600 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.336651 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.337652 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.337747 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2" gracePeriod=600 Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.975699 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" event={"ID":"18e91592-f700-4c18-b07f-0abe2e262fd9","Type":"ContainerStarted","Data":"b5d18e77a7b45c64932a5b7b623579aea712bab93225be2fa3fcc6d917a65934"} Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.977859 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.977897 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rxxnx" event={"ID":"3ab26663-d9d5-425f-9eee-36df23b8ce23","Type":"ContainerDied","Data":"ba815ef758455e238fed76b31d1a6d4aaef1ab82a0635e4498de556f5a0ec6ed"} Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.977896 4822 generic.go:334] "Generic (PLEG): container finished" podID="3ab26663-d9d5-425f-9eee-36df23b8ce23" containerID="ba815ef758455e238fed76b31d1a6d4aaef1ab82a0635e4498de556f5a0ec6ed" exitCode=0 Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.982275 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2" exitCode=0 Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.982345 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2"} Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.982869 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"} Oct 10 07:56:31 crc kubenswrapper[4822]: I1010 07:56:31.983001 4822 scope.go:117] "RemoveContainer" containerID="7de18b7e72f602982f9f8fb2986ccae1d3cd69b584e7fa2886fd814cca7402d7" Oct 10 07:56:32 crc kubenswrapper[4822]: I1010 07:56:32.005014 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" podStartSLOduration=3.004986073 podStartE2EDuration="3.004986073s" podCreationTimestamp="2025-10-10 07:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 07:56:31.999654879 +0000 UTC m=+5539.094813125" watchObservedRunningTime="2025-10-10 07:56:32.004986073 +0000 UTC m=+5539.100144279" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.351997 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.362871 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts\") pod \"3ab26663-d9d5-425f-9eee-36df23b8ce23\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.362966 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs\") pod \"3ab26663-d9d5-425f-9eee-36df23b8ce23\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.363092 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data\") pod \"3ab26663-d9d5-425f-9eee-36df23b8ce23\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.363144 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr47w\" (UniqueName: \"kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w\") pod \"3ab26663-d9d5-425f-9eee-36df23b8ce23\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.363225 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle\") 
pod \"3ab26663-d9d5-425f-9eee-36df23b8ce23\" (UID: \"3ab26663-d9d5-425f-9eee-36df23b8ce23\") " Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.365147 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs" (OuterVolumeSpecName: "logs") pod "3ab26663-d9d5-425f-9eee-36df23b8ce23" (UID: "3ab26663-d9d5-425f-9eee-36df23b8ce23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.373644 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts" (OuterVolumeSpecName: "scripts") pod "3ab26663-d9d5-425f-9eee-36df23b8ce23" (UID: "3ab26663-d9d5-425f-9eee-36df23b8ce23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.375551 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w" (OuterVolumeSpecName: "kube-api-access-gr47w") pod "3ab26663-d9d5-425f-9eee-36df23b8ce23" (UID: "3ab26663-d9d5-425f-9eee-36df23b8ce23"). InnerVolumeSpecName "kube-api-access-gr47w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.395966 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data" (OuterVolumeSpecName: "config-data") pod "3ab26663-d9d5-425f-9eee-36df23b8ce23" (UID: "3ab26663-d9d5-425f-9eee-36df23b8ce23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.414642 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ab26663-d9d5-425f-9eee-36df23b8ce23" (UID: "3ab26663-d9d5-425f-9eee-36df23b8ce23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.464678 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.464713 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr47w\" (UniqueName: \"kubernetes.io/projected/3ab26663-d9d5-425f-9eee-36df23b8ce23-kube-api-access-gr47w\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.464727 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.464738 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ab26663-d9d5-425f-9eee-36df23b8ce23-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:33 crc kubenswrapper[4822]: I1010 07:56:33.464752 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ab26663-d9d5-425f-9eee-36df23b8ce23-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.004877 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rxxnx" 
event={"ID":"3ab26663-d9d5-425f-9eee-36df23b8ce23","Type":"ContainerDied","Data":"39759b3c26d06ca740b0513b1469252d60a215fff66715815a7ee41303ee8095"} Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.004919 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39759b3c26d06ca740b0513b1469252d60a215fff66715815a7ee41303ee8095" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.004920 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rxxnx" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.094738 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6c56b55c6d-mgk7x"] Oct 10 07:56:34 crc kubenswrapper[4822]: E1010 07:56:34.095422 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab26663-d9d5-425f-9eee-36df23b8ce23" containerName="placement-db-sync" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.095525 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab26663-d9d5-425f-9eee-36df23b8ce23" containerName="placement-db-sync" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.095809 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab26663-d9d5-425f-9eee-36df23b8ce23" containerName="placement-db-sync" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.096779 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.098860 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zpxrn" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.098955 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.099608 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.118786 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c56b55c6d-mgk7x"] Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.202353 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-scripts\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.203072 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6cg5\" (UniqueName: \"kubernetes.io/projected/1dc18be1-e71e-4d36-ba14-39c872d97771-kube-api-access-b6cg5\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.203190 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-config-data\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 
07:56:34.203223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc18be1-e71e-4d36-ba14-39c872d97771-logs\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.203462 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-combined-ca-bundle\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.305090 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-config-data\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.305194 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc18be1-e71e-4d36-ba14-39c872d97771-logs\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.305223 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-combined-ca-bundle\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.305506 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-scripts\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.305940 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc18be1-e71e-4d36-ba14-39c872d97771-logs\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.306048 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6cg5\" (UniqueName: \"kubernetes.io/projected/1dc18be1-e71e-4d36-ba14-39c872d97771-kube-api-access-b6cg5\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.309469 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-combined-ca-bundle\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.309933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-scripts\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.310483 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1dc18be1-e71e-4d36-ba14-39c872d97771-config-data\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.323564 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6cg5\" (UniqueName: \"kubernetes.io/projected/1dc18be1-e71e-4d36-ba14-39c872d97771-kube-api-access-b6cg5\") pod \"placement-6c56b55c6d-mgk7x\" (UID: \"1dc18be1-e71e-4d36-ba14-39c872d97771\") " pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.415088 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:34 crc kubenswrapper[4822]: I1010 07:56:34.944624 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c56b55c6d-mgk7x"] Oct 10 07:56:34 crc kubenswrapper[4822]: W1010 07:56:34.949222 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dc18be1_e71e_4d36_ba14_39c872d97771.slice/crio-668852f28efc14ce395d7572e9d37b2a7cb6d3761aadc6754d573f57f84d379a WatchSource:0}: Error finding container 668852f28efc14ce395d7572e9d37b2a7cb6d3761aadc6754d573f57f84d379a: Status 404 returned error can't find the container with id 668852f28efc14ce395d7572e9d37b2a7cb6d3761aadc6754d573f57f84d379a Oct 10 07:56:35 crc kubenswrapper[4822]: I1010 07:56:35.023506 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c56b55c6d-mgk7x" event={"ID":"1dc18be1-e71e-4d36-ba14-39c872d97771","Type":"ContainerStarted","Data":"668852f28efc14ce395d7572e9d37b2a7cb6d3761aadc6754d573f57f84d379a"} Oct 10 07:56:36 crc kubenswrapper[4822]: I1010 07:56:36.037934 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c56b55c6d-mgk7x" 
event={"ID":"1dc18be1-e71e-4d36-ba14-39c872d97771","Type":"ContainerStarted","Data":"01492ec8b00d54cddf5c8f70de017036539a45d6a4d7d0a4175423c1bf5b81ee"} Oct 10 07:56:36 crc kubenswrapper[4822]: I1010 07:56:36.038531 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c56b55c6d-mgk7x" event={"ID":"1dc18be1-e71e-4d36-ba14-39c872d97771","Type":"ContainerStarted","Data":"f438354c61ea4f31f512fcd95e757dcdaf6e8d593ee9723a69c58e7c0d92243c"} Oct 10 07:56:36 crc kubenswrapper[4822]: I1010 07:56:36.038554 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:36 crc kubenswrapper[4822]: I1010 07:56:36.038565 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:56:36 crc kubenswrapper[4822]: I1010 07:56:36.064969 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6c56b55c6d-mgk7x" podStartSLOduration=2.064946779 podStartE2EDuration="2.064946779s" podCreationTimestamp="2025-10-10 07:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:56:36.056444144 +0000 UTC m=+5543.151602360" watchObservedRunningTime="2025-10-10 07:56:36.064946779 +0000 UTC m=+5543.160104995" Oct 10 07:56:39 crc kubenswrapper[4822]: I1010 07:56:39.580176 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:56:39 crc kubenswrapper[4822]: I1010 07:56:39.673368 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:56:39 crc kubenswrapper[4822]: I1010 07:56:39.673635 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="dnsmasq-dns" 
containerID="cri-o://29ace8d0d9db8a4c779fe59823d3232b89eedf7b4cf3542178e8bb889c20b669" gracePeriod=10 Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.078370 4822 generic.go:334] "Generic (PLEG): container finished" podID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerID="29ace8d0d9db8a4c779fe59823d3232b89eedf7b4cf3542178e8bb889c20b669" exitCode=0 Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.078458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" event={"ID":"55e8047f-1c20-4c8c-a1a4-a7314c7ce152","Type":"ContainerDied","Data":"29ace8d0d9db8a4c779fe59823d3232b89eedf7b4cf3542178e8bb889c20b669"} Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.078696 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" event={"ID":"55e8047f-1c20-4c8c-a1a4-a7314c7ce152","Type":"ContainerDied","Data":"a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7"} Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.078712 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39bb0d8e17040c080ffef332db05e292d408d96a30e404c525a524e8f74a5a7" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.126936 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.216458 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb\") pod \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.216509 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc\") pod \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.216630 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsm6m\" (UniqueName: \"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m\") pod \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.216710 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config\") pod \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.216766 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb\") pod \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\" (UID: \"55e8047f-1c20-4c8c-a1a4-a7314c7ce152\") " Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.225334 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m" (OuterVolumeSpecName: "kube-api-access-gsm6m") pod "55e8047f-1c20-4c8c-a1a4-a7314c7ce152" (UID: "55e8047f-1c20-4c8c-a1a4-a7314c7ce152"). InnerVolumeSpecName "kube-api-access-gsm6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.264422 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55e8047f-1c20-4c8c-a1a4-a7314c7ce152" (UID: "55e8047f-1c20-4c8c-a1a4-a7314c7ce152"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.264436 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55e8047f-1c20-4c8c-a1a4-a7314c7ce152" (UID: "55e8047f-1c20-4c8c-a1a4-a7314c7ce152"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.264574 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55e8047f-1c20-4c8c-a1a4-a7314c7ce152" (UID: "55e8047f-1c20-4c8c-a1a4-a7314c7ce152"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.267101 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config" (OuterVolumeSpecName: "config") pod "55e8047f-1c20-4c8c-a1a4-a7314c7ce152" (UID: "55e8047f-1c20-4c8c-a1a4-a7314c7ce152"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.318122 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.318157 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.318169 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.318176 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:40 crc kubenswrapper[4822]: I1010 07:56:40.318187 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsm6m\" (UniqueName: \"kubernetes.io/projected/55e8047f-1c20-4c8c-a1a4-a7314c7ce152-kube-api-access-gsm6m\") on node \"crc\" DevicePath \"\"" Oct 10 07:56:41 crc kubenswrapper[4822]: I1010 07:56:41.087320 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7468c8c7-zph7v" Oct 10 07:56:41 crc kubenswrapper[4822]: I1010 07:56:41.123874 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:56:41 crc kubenswrapper[4822]: I1010 07:56:41.143600 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7468c8c7-zph7v"] Oct 10 07:56:41 crc kubenswrapper[4822]: I1010 07:56:41.667287 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" path="/var/lib/kubelet/pods/55e8047f-1c20-4c8c-a1a4-a7314c7ce152/volumes" Oct 10 07:57:05 crc kubenswrapper[4822]: I1010 07:57:05.481993 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:57:05 crc kubenswrapper[4822]: I1010 07:57:05.508547 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c56b55c6d-mgk7x" Oct 10 07:57:23 crc kubenswrapper[4822]: I1010 07:57:23.569186 4822 scope.go:117] "RemoveContainer" containerID="27206f97264088f99cacbdf8b70d80b2a3e96d441b9b61300877741e6d97930b" Oct 10 07:57:23 crc kubenswrapper[4822]: I1010 07:57:23.624178 4822 scope.go:117] "RemoveContainer" containerID="db680deb468d8342ad9710cb93cb05ce7366b21663f39ca6cb8498a28e1f007e" Oct 10 07:57:23 crc kubenswrapper[4822]: I1010 07:57:23.651415 4822 scope.go:117] "RemoveContainer" containerID="a671c82969259ce74e8ff03edfed8a217d91a3ca469284a9ca759d669a88c8a7" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.223628 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-j4j2s"] Oct 10 07:57:29 crc kubenswrapper[4822]: E1010 07:57:29.224368 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="init" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.224382 4822 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="init" Oct 10 07:57:29 crc kubenswrapper[4822]: E1010 07:57:29.224406 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="dnsmasq-dns" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.224412 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="dnsmasq-dns" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.227933 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e8047f-1c20-4c8c-a1a4-a7314c7ce152" containerName="dnsmasq-dns" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.228697 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.231343 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4j2s"] Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.319559 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96brn\" (UniqueName: \"kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn\") pod \"nova-api-db-create-j4j2s\" (UID: \"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284\") " pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.327519 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7vbkg"] Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.328555 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.338022 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7vbkg"] Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.421089 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96brn\" (UniqueName: \"kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn\") pod \"nova-api-db-create-j4j2s\" (UID: \"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284\") " pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.421195 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh\") pod \"nova-cell0-db-create-7vbkg\" (UID: \"04b97456-2a65-4982-9b6c-c8c8557588d3\") " pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.423799 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8qlfm"] Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.425366 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.434101 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8qlfm"] Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.445404 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96brn\" (UniqueName: \"kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn\") pod \"nova-api-db-create-j4j2s\" (UID: \"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284\") " pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.523183 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh\") pod \"nova-cell0-db-create-7vbkg\" (UID: \"04b97456-2a65-4982-9b6c-c8c8557588d3\") " pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.523567 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmrb\" (UniqueName: \"kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb\") pod \"nova-cell1-db-create-8qlfm\" (UID: \"f15f4fd3-e407-48bc-921e-181dcc8b9be7\") " pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.544452 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh\") pod \"nova-cell0-db-create-7vbkg\" (UID: \"04b97456-2a65-4982-9b6c-c8c8557588d3\") " pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.550661 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.625027 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmrb\" (UniqueName: \"kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb\") pod \"nova-cell1-db-create-8qlfm\" (UID: \"f15f4fd3-e407-48bc-921e-181dcc8b9be7\") " pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.640251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmrb\" (UniqueName: \"kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb\") pod \"nova-cell1-db-create-8qlfm\" (UID: \"f15f4fd3-e407-48bc-921e-181dcc8b9be7\") " pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.642251 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:29 crc kubenswrapper[4822]: I1010 07:57:29.742904 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.051684 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4j2s"] Oct 10 07:57:30 crc kubenswrapper[4822]: W1010 07:57:30.060834 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c00ec59_c1e4_4cd1_817a_ebff4fcd2284.slice/crio-33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661 WatchSource:0}: Error finding container 33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661: Status 404 returned error can't find the container with id 33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.154638 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7vbkg"] Oct 10 07:57:30 crc kubenswrapper[4822]: W1010 07:57:30.163676 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b97456_2a65_4982_9b6c_c8c8557588d3.slice/crio-cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998 WatchSource:0}: Error finding container cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998: Status 404 returned error can't find the container with id cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.233665 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8qlfm"] Oct 10 07:57:30 crc kubenswrapper[4822]: W1010 07:57:30.238104 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15f4fd3_e407_48bc_921e_181dcc8b9be7.slice/crio-e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7 WatchSource:0}: Error finding container 
e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7: Status 404 returned error can't find the container with id e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.596440 4822 generic.go:334] "Generic (PLEG): container finished" podID="4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" containerID="2e07c28ad20d05c5e64db0fe2e34034b4411b952d55e45cba191f43ce01c7a1b" exitCode=0 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.596491 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4j2s" event={"ID":"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284","Type":"ContainerDied","Data":"2e07c28ad20d05c5e64db0fe2e34034b4411b952d55e45cba191f43ce01c7a1b"} Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.596532 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4j2s" event={"ID":"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284","Type":"ContainerStarted","Data":"33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661"} Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.598834 4822 generic.go:334] "Generic (PLEG): container finished" podID="04b97456-2a65-4982-9b6c-c8c8557588d3" containerID="8b1c79468a6144e4550065b536d488eba6c7caee18b65d8a4ba663cd6eecddc1" exitCode=0 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.598922 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7vbkg" event={"ID":"04b97456-2a65-4982-9b6c-c8c8557588d3","Type":"ContainerDied","Data":"8b1c79468a6144e4550065b536d488eba6c7caee18b65d8a4ba663cd6eecddc1"} Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.598947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7vbkg" event={"ID":"04b97456-2a65-4982-9b6c-c8c8557588d3","Type":"ContainerStarted","Data":"cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998"} Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 
07:57:30.600520 4822 generic.go:334] "Generic (PLEG): container finished" podID="f15f4fd3-e407-48bc-921e-181dcc8b9be7" containerID="df039484ed1645dd1bd960c601929ef9ecf2a3910b85aeee146870fb07199795" exitCode=0 Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.600547 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qlfm" event={"ID":"f15f4fd3-e407-48bc-921e-181dcc8b9be7","Type":"ContainerDied","Data":"df039484ed1645dd1bd960c601929ef9ecf2a3910b85aeee146870fb07199795"} Oct 10 07:57:30 crc kubenswrapper[4822]: I1010 07:57:30.600568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qlfm" event={"ID":"f15f4fd3-e407-48bc-921e-181dcc8b9be7","Type":"ContainerStarted","Data":"e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7"} Oct 10 07:57:31 crc kubenswrapper[4822]: I1010 07:57:31.975865 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.069423 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.080585 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.098364 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmrb\" (UniqueName: \"kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb\") pod \"f15f4fd3-e407-48bc-921e-181dcc8b9be7\" (UID: \"f15f4fd3-e407-48bc-921e-181dcc8b9be7\") " Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.106569 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb" (OuterVolumeSpecName: "kube-api-access-frmrb") pod "f15f4fd3-e407-48bc-921e-181dcc8b9be7" (UID: "f15f4fd3-e407-48bc-921e-181dcc8b9be7"). InnerVolumeSpecName "kube-api-access-frmrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.199509 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96brn\" (UniqueName: \"kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn\") pod \"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284\" (UID: \"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284\") " Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.199613 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh\") pod \"04b97456-2a65-4982-9b6c-c8c8557588d3\" (UID: \"04b97456-2a65-4982-9b6c-c8c8557588d3\") " Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.200505 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmrb\" (UniqueName: \"kubernetes.io/projected/f15f4fd3-e407-48bc-921e-181dcc8b9be7-kube-api-access-frmrb\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.203038 4822 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn" (OuterVolumeSpecName: "kube-api-access-96brn") pod "4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" (UID: "4c00ec59-c1e4-4cd1-817a-ebff4fcd2284"). InnerVolumeSpecName "kube-api-access-96brn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.203124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh" (OuterVolumeSpecName: "kube-api-access-srxdh") pod "04b97456-2a65-4982-9b6c-c8c8557588d3" (UID: "04b97456-2a65-4982-9b6c-c8c8557588d3"). InnerVolumeSpecName "kube-api-access-srxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.302225 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96brn\" (UniqueName: \"kubernetes.io/projected/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284-kube-api-access-96brn\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.302253 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/04b97456-2a65-4982-9b6c-c8c8557588d3-kube-api-access-srxdh\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.623279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8qlfm" event={"ID":"f15f4fd3-e407-48bc-921e-181dcc8b9be7","Type":"ContainerDied","Data":"e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7"} Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.623352 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a56c43b9f7d7b14faaca39f0e28df43a3c953c383e9e52d0a8f21a28d34ae7" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 
07:57:32.623316 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8qlfm" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.626031 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4j2s" event={"ID":"4c00ec59-c1e4-4cd1-817a-ebff4fcd2284","Type":"ContainerDied","Data":"33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661"} Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.626089 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33646dbe0f83e26a7107b4fc3a5e585315e2a2b6eb823e2229f02177365c5661" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.626096 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4j2s" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.629383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7vbkg" event={"ID":"04b97456-2a65-4982-9b6c-c8c8557588d3","Type":"ContainerDied","Data":"cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998"} Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.629458 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cebbd215f2734eb058cbbcf06dc6cccf96af7f15f6b04120d8482818f1969998" Oct 10 07:57:32 crc kubenswrapper[4822]: I1010 07:57:32.629459 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7vbkg" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.463148 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2e5e-account-create-xkp95"] Oct 10 07:57:39 crc kubenswrapper[4822]: E1010 07:57:39.464069 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b97456-2a65-4982-9b6c-c8c8557588d3" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464081 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b97456-2a65-4982-9b6c-c8c8557588d3" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: E1010 07:57:39.464107 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15f4fd3-e407-48bc-921e-181dcc8b9be7" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464113 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15f4fd3-e407-48bc-921e-181dcc8b9be7" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: E1010 07:57:39.464128 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464136 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464298 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b97456-2a65-4982-9b6c-c8c8557588d3" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464320 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464330 4822 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f15f4fd3-e407-48bc-921e-181dcc8b9be7" containerName="mariadb-database-create" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.464873 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.466935 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.482050 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2e5e-account-create-xkp95"] Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.544027 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xn2g\" (UniqueName: \"kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g\") pod \"nova-api-2e5e-account-create-xkp95\" (UID: \"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7\") " pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.655836 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xn2g\" (UniqueName: \"kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g\") pod \"nova-api-2e5e-account-create-xkp95\" (UID: \"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7\") " pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.698901 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xn2g\" (UniqueName: \"kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g\") pod \"nova-api-2e5e-account-create-xkp95\" (UID: \"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7\") " pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.699754 4822 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-f4e8-account-create-s5f9z"] Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.701660 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f4e8-account-create-s5f9z"] Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.701759 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.704681 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.758345 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg6x\" (UniqueName: \"kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x\") pod \"nova-cell0-f4e8-account-create-s5f9z\" (UID: \"a3d4b476-bea7-469c-9cf2-2b50e729ab7b\") " pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.791259 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.855850 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eaaf-account-create-qdkvx"] Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.857110 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.859665 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.862313 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg6x\" (UniqueName: \"kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x\") pod \"nova-cell0-f4e8-account-create-s5f9z\" (UID: \"a3d4b476-bea7-469c-9cf2-2b50e729ab7b\") " pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.867203 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eaaf-account-create-qdkvx"] Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.884690 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg6x\" (UniqueName: \"kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x\") pod \"nova-cell0-f4e8-account-create-s5f9z\" (UID: \"a3d4b476-bea7-469c-9cf2-2b50e729ab7b\") " pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:39 crc kubenswrapper[4822]: I1010 07:57:39.964449 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rx6\" (UniqueName: \"kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6\") pod \"nova-cell1-eaaf-account-create-qdkvx\" (UID: \"e8a355f7-9246-47ac-9524-15e54066c591\") " pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.065591 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rx6\" (UniqueName: \"kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6\") pod \"nova-cell1-eaaf-account-create-qdkvx\" 
(UID: \"e8a355f7-9246-47ac-9524-15e54066c591\") " pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.066224 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.084923 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rx6\" (UniqueName: \"kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6\") pod \"nova-cell1-eaaf-account-create-qdkvx\" (UID: \"e8a355f7-9246-47ac-9524-15e54066c591\") " pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.248838 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.305202 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2e5e-account-create-xkp95"] Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.578523 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f4e8-account-create-s5f9z"] Oct 10 07:57:40 crc kubenswrapper[4822]: W1010 07:57:40.622001 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d4b476_bea7_469c_9cf2_2b50e729ab7b.slice/crio-f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882 WatchSource:0}: Error finding container f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882: Status 404 returned error can't find the container with id f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882 Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.736382 4822 generic.go:334] "Generic (PLEG): container finished" podID="3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" 
containerID="7e0712dae4e893ae4487877923c750d2d3b120c6e4cc787b8b025a7b8df38942" exitCode=0 Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.736453 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e5e-account-create-xkp95" event={"ID":"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7","Type":"ContainerDied","Data":"7e0712dae4e893ae4487877923c750d2d3b120c6e4cc787b8b025a7b8df38942"} Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.736479 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e5e-account-create-xkp95" event={"ID":"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7","Type":"ContainerStarted","Data":"98dfb35e179d7097b8b81b81ec6007e678f0cadc5356e5f09d596d1ce92ad127"} Oct 10 07:57:40 crc kubenswrapper[4822]: W1010 07:57:40.738650 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8a355f7_9246_47ac_9524_15e54066c591.slice/crio-d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d WatchSource:0}: Error finding container d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d: Status 404 returned error can't find the container with id d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.739747 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eaaf-account-create-qdkvx"] Oct 10 07:57:40 crc kubenswrapper[4822]: I1010 07:57:40.740440 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" event={"ID":"a3d4b476-bea7-469c-9cf2-2b50e729ab7b","Type":"ContainerStarted","Data":"f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882"} Oct 10 07:57:41 crc kubenswrapper[4822]: I1010 07:57:41.752406 4822 generic.go:334] "Generic (PLEG): container finished" podID="a3d4b476-bea7-469c-9cf2-2b50e729ab7b" 
containerID="fe189be58c662505c424407541f1be96d092727f14a14af1ad08a9ad47f129fa" exitCode=0 Oct 10 07:57:41 crc kubenswrapper[4822]: I1010 07:57:41.752445 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" event={"ID":"a3d4b476-bea7-469c-9cf2-2b50e729ab7b","Type":"ContainerDied","Data":"fe189be58c662505c424407541f1be96d092727f14a14af1ad08a9ad47f129fa"} Oct 10 07:57:41 crc kubenswrapper[4822]: I1010 07:57:41.755922 4822 generic.go:334] "Generic (PLEG): container finished" podID="e8a355f7-9246-47ac-9524-15e54066c591" containerID="4676f672928c1465d6cbbd7ea59a90a8716372df09d663ea5ea45bdc9b07e188" exitCode=0 Oct 10 07:57:41 crc kubenswrapper[4822]: I1010 07:57:41.755987 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" event={"ID":"e8a355f7-9246-47ac-9524-15e54066c591","Type":"ContainerDied","Data":"4676f672928c1465d6cbbd7ea59a90a8716372df09d663ea5ea45bdc9b07e188"} Oct 10 07:57:41 crc kubenswrapper[4822]: I1010 07:57:41.756021 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" event={"ID":"e8a355f7-9246-47ac-9524-15e54066c591","Type":"ContainerStarted","Data":"d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d"} Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.145949 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.209218 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xn2g\" (UniqueName: \"kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g\") pod \"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7\" (UID: \"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7\") " Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.214843 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g" (OuterVolumeSpecName: "kube-api-access-9xn2g") pod "3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" (UID: "3e24cfb7-bc5a-4b10-8b4f-a4df917256f7"). InnerVolumeSpecName "kube-api-access-9xn2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.310699 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xn2g\" (UniqueName: \"kubernetes.io/projected/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7-kube-api-access-9xn2g\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.767697 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2e5e-account-create-xkp95" Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.770195 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e5e-account-create-xkp95" event={"ID":"3e24cfb7-bc5a-4b10-8b4f-a4df917256f7","Type":"ContainerDied","Data":"98dfb35e179d7097b8b81b81ec6007e678f0cadc5356e5f09d596d1ce92ad127"} Oct 10 07:57:42 crc kubenswrapper[4822]: I1010 07:57:42.770276 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98dfb35e179d7097b8b81b81ec6007e678f0cadc5356e5f09d596d1ce92ad127" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.218796 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.225577 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.233924 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:43 crc kubenswrapper[4822]: E1010 07:57:43.234624 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.234649 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: E1010 07:57:43.234662 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a355f7-9246-47ac-9524-15e54066c591" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.234670 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a355f7-9246-47ac-9524-15e54066c591" containerName="mariadb-account-create" 
Oct 10 07:57:43 crc kubenswrapper[4822]: E1010 07:57:43.234695 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d4b476-bea7-469c-9cf2-2b50e729ab7b" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.234701 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d4b476-bea7-469c-9cf2-2b50e729ab7b" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.235108 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.235140 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d4b476-bea7-469c-9cf2-2b50e729ab7b" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.235156 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a355f7-9246-47ac-9524-15e54066c591" containerName="mariadb-account-create" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.236975 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.251772 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.332860 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlg6x\" (UniqueName: \"kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x\") pod \"a3d4b476-bea7-469c-9cf2-2b50e729ab7b\" (UID: \"a3d4b476-bea7-469c-9cf2-2b50e729ab7b\") " Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.333195 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rx6\" (UniqueName: \"kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6\") pod \"e8a355f7-9246-47ac-9524-15e54066c591\" (UID: \"e8a355f7-9246-47ac-9524-15e54066c591\") " Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.333589 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.333637 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.333674 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm54d\" 
(UniqueName: \"kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.338509 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x" (OuterVolumeSpecName: "kube-api-access-zlg6x") pod "a3d4b476-bea7-469c-9cf2-2b50e729ab7b" (UID: "a3d4b476-bea7-469c-9cf2-2b50e729ab7b"). InnerVolumeSpecName "kube-api-access-zlg6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.339091 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6" (OuterVolumeSpecName: "kube-api-access-t4rx6") pod "e8a355f7-9246-47ac-9524-15e54066c591" (UID: "e8a355f7-9246-47ac-9524-15e54066c591"). InnerVolumeSpecName "kube-api-access-t4rx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.436317 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.436434 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.436473 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm54d\" (UniqueName: \"kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.436649 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4rx6\" (UniqueName: \"kubernetes.io/projected/e8a355f7-9246-47ac-9524-15e54066c591-kube-api-access-t4rx6\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.436666 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlg6x\" (UniqueName: \"kubernetes.io/projected/a3d4b476-bea7-469c-9cf2-2b50e729ab7b-kube-api-access-zlg6x\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.437546 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.437596 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.452878 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm54d\" (UniqueName: \"kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d\") pod \"certified-operators-9kx2k\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.569881 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.782090 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" event={"ID":"a3d4b476-bea7-469c-9cf2-2b50e729ab7b","Type":"ContainerDied","Data":"f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882"} Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.782449 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f127b47ffffda3f9698fd553b2b28c5929f05eb2065064bf79be896c5db3d882" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.782516 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f4e8-account-create-s5f9z" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.791714 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" event={"ID":"e8a355f7-9246-47ac-9524-15e54066c591","Type":"ContainerDied","Data":"d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d"} Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.791761 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43977094973dd36ef27f3bb48984cb6fe26bc21e92a512bae921a4ada10042d" Oct 10 07:57:43 crc kubenswrapper[4822]: I1010 07:57:43.791808 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eaaf-account-create-qdkvx" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.068267 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:44 crc kubenswrapper[4822]: W1010 07:57:44.071007 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8757d75_bb34_40d3_afa6_d6e1f33be10b.slice/crio-2b7fc022c885b8c05989297a25db28e2148e5b1764b54f5af0f5ffdc4fbfcba8 WatchSource:0}: Error finding container 2b7fc022c885b8c05989297a25db28e2148e5b1764b54f5af0f5ffdc4fbfcba8: Status 404 returned error can't find the container with id 2b7fc022c885b8c05989297a25db28e2148e5b1764b54f5af0f5ffdc4fbfcba8 Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.801780 4822 generic.go:334] "Generic (PLEG): container finished" podID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerID="af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e" exitCode=0 Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.801856 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" 
event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerDied","Data":"af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e"} Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.802047 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerStarted","Data":"2b7fc022c885b8c05989297a25db28e2148e5b1764b54f5af0f5ffdc4fbfcba8"} Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.804061 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.930666 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmtt7"] Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.932355 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.934584 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.934918 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d2wrj" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.944085 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.953688 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmtt7"] Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.964397 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: 
\"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.964568 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqlpg\" (UniqueName: \"kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.964594 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:44 crc kubenswrapper[4822]: I1010 07:57:44.964627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.065905 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqlpg\" (UniqueName: \"kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.065965 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.066000 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.066018 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.077822 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.086108 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqlpg\" (UniqueName: \"kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.088870 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.107843 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data\") pod \"nova-cell0-conductor-db-sync-fmtt7\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.247938 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.483134 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmtt7"] Oct 10 07:57:45 crc kubenswrapper[4822]: W1010 07:57:45.488852 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ae4f6b_1cd9_497b_9665_25b570793997.slice/crio-9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224 WatchSource:0}: Error finding container 9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224: Status 404 returned error can't find the container with id 9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224 Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.813011 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerStarted","Data":"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e"} Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.814460 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" 
event={"ID":"54ae4f6b-1cd9-497b-9665-25b570793997","Type":"ContainerStarted","Data":"6a9f61599a80c837ee3233ca76d0d97bc873d5ea3ad5f72c4039992b577bdf2f"} Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.814499 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" event={"ID":"54ae4f6b-1cd9-497b-9665-25b570793997","Type":"ContainerStarted","Data":"9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224"} Oct 10 07:57:45 crc kubenswrapper[4822]: I1010 07:57:45.844117 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" podStartSLOduration=1.844096094 podStartE2EDuration="1.844096094s" podCreationTimestamp="2025-10-10 07:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:57:45.841999033 +0000 UTC m=+5612.937157239" watchObservedRunningTime="2025-10-10 07:57:45.844096094 +0000 UTC m=+5612.939254300" Oct 10 07:57:46 crc kubenswrapper[4822]: I1010 07:57:46.828764 4822 generic.go:334] "Generic (PLEG): container finished" podID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerID="08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e" exitCode=0 Oct 10 07:57:46 crc kubenswrapper[4822]: I1010 07:57:46.828990 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerDied","Data":"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e"} Oct 10 07:57:47 crc kubenswrapper[4822]: I1010 07:57:47.839108 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerStarted","Data":"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93"} Oct 10 07:57:47 crc kubenswrapper[4822]: I1010 
07:57:47.864459 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9kx2k" podStartSLOduration=2.319413816 podStartE2EDuration="4.864435703s" podCreationTimestamp="2025-10-10 07:57:43 +0000 UTC" firstStartedPulling="2025-10-10 07:57:44.803851751 +0000 UTC m=+5611.899009947" lastFinishedPulling="2025-10-10 07:57:47.348873638 +0000 UTC m=+5614.444031834" observedRunningTime="2025-10-10 07:57:47.856594797 +0000 UTC m=+5614.951753003" watchObservedRunningTime="2025-10-10 07:57:47.864435703 +0000 UTC m=+5614.959593909" Oct 10 07:57:50 crc kubenswrapper[4822]: I1010 07:57:50.868206 4822 generic.go:334] "Generic (PLEG): container finished" podID="54ae4f6b-1cd9-497b-9665-25b570793997" containerID="6a9f61599a80c837ee3233ca76d0d97bc873d5ea3ad5f72c4039992b577bdf2f" exitCode=0 Oct 10 07:57:50 crc kubenswrapper[4822]: I1010 07:57:50.868377 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" event={"ID":"54ae4f6b-1cd9-497b-9665-25b570793997","Type":"ContainerDied","Data":"6a9f61599a80c837ee3233ca76d0d97bc873d5ea3ad5f72c4039992b577bdf2f"} Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.223678 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.319635 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data\") pod \"54ae4f6b-1cd9-497b-9665-25b570793997\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.319847 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle\") pod \"54ae4f6b-1cd9-497b-9665-25b570793997\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.319895 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqlpg\" (UniqueName: \"kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg\") pod \"54ae4f6b-1cd9-497b-9665-25b570793997\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.320067 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts\") pod \"54ae4f6b-1cd9-497b-9665-25b570793997\" (UID: \"54ae4f6b-1cd9-497b-9665-25b570793997\") " Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.325852 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts" (OuterVolumeSpecName: "scripts") pod "54ae4f6b-1cd9-497b-9665-25b570793997" (UID: "54ae4f6b-1cd9-497b-9665-25b570793997"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.329042 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg" (OuterVolumeSpecName: "kube-api-access-nqlpg") pod "54ae4f6b-1cd9-497b-9665-25b570793997" (UID: "54ae4f6b-1cd9-497b-9665-25b570793997"). InnerVolumeSpecName "kube-api-access-nqlpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.350602 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54ae4f6b-1cd9-497b-9665-25b570793997" (UID: "54ae4f6b-1cd9-497b-9665-25b570793997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.353616 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data" (OuterVolumeSpecName: "config-data") pod "54ae4f6b-1cd9-497b-9665-25b570793997" (UID: "54ae4f6b-1cd9-497b-9665-25b570793997"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.423331 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.423373 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqlpg\" (UniqueName: \"kubernetes.io/projected/54ae4f6b-1cd9-497b-9665-25b570793997-kube-api-access-nqlpg\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.423386 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.423396 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ae4f6b-1cd9-497b-9665-25b570793997-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.897234 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" event={"ID":"54ae4f6b-1cd9-497b-9665-25b570793997","Type":"ContainerDied","Data":"9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224"} Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.897282 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9997ab8a075d78565b850528fdd2afa40860f80526fbd405d7eba44aaf9a4224" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.897324 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmtt7" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.984849 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:57:52 crc kubenswrapper[4822]: E1010 07:57:52.985439 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ae4f6b-1cd9-497b-9665-25b570793997" containerName="nova-cell0-conductor-db-sync" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.985462 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ae4f6b-1cd9-497b-9665-25b570793997" containerName="nova-cell0-conductor-db-sync" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.985676 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ae4f6b-1cd9-497b-9665-25b570793997" containerName="nova-cell0-conductor-db-sync" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.986588 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.989388 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-d2wrj" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.994439 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 07:57:52 crc kubenswrapper[4822]: I1010 07:57:52.994663 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.034828 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: 
I1010 07:57:53.034974 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfrq\" (UniqueName: \"kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.035242 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.136417 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.136791 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.136847 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfrq\" (UniqueName: \"kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.140742 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.140795 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.158727 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfrq\" (UniqueName: \"kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq\") pod \"nova-cell0-conductor-0\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.334927 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.570636 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.571515 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.627627 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.807477 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:57:53 crc kubenswrapper[4822]: I1010 07:57:53.916993 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f619e646-b85d-40a4-bb47-67db89884281","Type":"ContainerStarted","Data":"470dcd15cccb11fdd8869fb742ba49a7fb2e9e1b2dea130903c7381ecf705482"} Oct 10 07:57:54 crc kubenswrapper[4822]: I1010 07:57:54.001220 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:54 crc kubenswrapper[4822]: I1010 07:57:54.073449 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:54 crc kubenswrapper[4822]: I1010 07:57:54.928912 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f619e646-b85d-40a4-bb47-67db89884281","Type":"ContainerStarted","Data":"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24"} Oct 10 07:57:54 crc kubenswrapper[4822]: I1010 07:57:54.955624 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.955606064 
podStartE2EDuration="2.955606064s" podCreationTimestamp="2025-10-10 07:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:57:54.95095935 +0000 UTC m=+5622.046117566" watchObservedRunningTime="2025-10-10 07:57:54.955606064 +0000 UTC m=+5622.050764260" Oct 10 07:57:55 crc kubenswrapper[4822]: I1010 07:57:55.935940 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9kx2k" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="registry-server" containerID="cri-o://b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93" gracePeriod=2 Oct 10 07:57:55 crc kubenswrapper[4822]: I1010 07:57:55.936167 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.370027 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.502740 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm54d\" (UniqueName: \"kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d\") pod \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.502965 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities\") pod \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.503003 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content\") pod \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\" (UID: \"a8757d75-bb34-40d3-afa6-d6e1f33be10b\") " Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.504035 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities" (OuterVolumeSpecName: "utilities") pod "a8757d75-bb34-40d3-afa6-d6e1f33be10b" (UID: "a8757d75-bb34-40d3-afa6-d6e1f33be10b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.508056 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d" (OuterVolumeSpecName: "kube-api-access-rm54d") pod "a8757d75-bb34-40d3-afa6-d6e1f33be10b" (UID: "a8757d75-bb34-40d3-afa6-d6e1f33be10b"). InnerVolumeSpecName "kube-api-access-rm54d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.604999 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.605044 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm54d\" (UniqueName: \"kubernetes.io/projected/a8757d75-bb34-40d3-afa6-d6e1f33be10b-kube-api-access-rm54d\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.952364 4822 generic.go:334] "Generic (PLEG): container finished" podID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerID="b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93" exitCode=0 Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.952453 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerDied","Data":"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93"} Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.952904 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kx2k" event={"ID":"a8757d75-bb34-40d3-afa6-d6e1f33be10b","Type":"ContainerDied","Data":"2b7fc022c885b8c05989297a25db28e2148e5b1764b54f5af0f5ffdc4fbfcba8"} Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.952950 4822 scope.go:117] "RemoveContainer" containerID="b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.952482 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9kx2k" Oct 10 07:57:56 crc kubenswrapper[4822]: I1010 07:57:56.995997 4822 scope.go:117] "RemoveContainer" containerID="08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.022744 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8757d75-bb34-40d3-afa6-d6e1f33be10b" (UID: "a8757d75-bb34-40d3-afa6-d6e1f33be10b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.055240 4822 scope.go:117] "RemoveContainer" containerID="af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.091022 4822 scope.go:117] "RemoveContainer" containerID="b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93" Oct 10 07:57:57 crc kubenswrapper[4822]: E1010 07:57:57.091711 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93\": container with ID starting with b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93 not found: ID does not exist" containerID="b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.091765 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93"} err="failed to get container status \"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93\": rpc error: code = NotFound desc = could not find container \"b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93\": 
container with ID starting with b73d839bf95d2578ab9dec7483da8a51b80c9936e3760e10420f811c38ea8a93 not found: ID does not exist" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.091817 4822 scope.go:117] "RemoveContainer" containerID="08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e" Oct 10 07:57:57 crc kubenswrapper[4822]: E1010 07:57:57.092453 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e\": container with ID starting with 08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e not found: ID does not exist" containerID="08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.092491 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e"} err="failed to get container status \"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e\": rpc error: code = NotFound desc = could not find container \"08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e\": container with ID starting with 08992730ae718a4a86abddfebf9c8fc2026a2e6bac32edc1820e04235cd2a84e not found: ID does not exist" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.092519 4822 scope.go:117] "RemoveContainer" containerID="af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e" Oct 10 07:57:57 crc kubenswrapper[4822]: E1010 07:57:57.093074 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e\": container with ID starting with af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e not found: ID does not exist" 
containerID="af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.093108 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e"} err="failed to get container status \"af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e\": rpc error: code = NotFound desc = could not find container \"af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e\": container with ID starting with af736e3b22fcb2604c69037ed80d589211335bfb7012228e3b202527f632db5e not found: ID does not exist" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.112526 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8757d75-bb34-40d3-afa6-d6e1f33be10b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.296875 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.309896 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9kx2k"] Oct 10 07:57:57 crc kubenswrapper[4822]: I1010 07:57:57.691652 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" path="/var/lib/kubelet/pods/a8757d75-bb34-40d3-afa6-d6e1f33be10b/volumes" Oct 10 07:58:03 crc kubenswrapper[4822]: I1010 07:58:03.383676 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.025538 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lmd99"] Oct 10 07:58:04 crc kubenswrapper[4822]: E1010 07:58:04.025943 4822 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="registry-server" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.025962 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="registry-server" Oct 10 07:58:04 crc kubenswrapper[4822]: E1010 07:58:04.025980 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="extract-content" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.025989 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="extract-content" Oct 10 07:58:04 crc kubenswrapper[4822]: E1010 07:58:04.026016 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="extract-utilities" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.026025 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="extract-utilities" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.026228 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8757d75-bb34-40d3-afa6-d6e1f33be10b" containerName="registry-server" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.027122 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.038422 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.043076 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.046508 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lmd99"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.137172 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.139277 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.149322 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.156230 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.156515 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.156610 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.156696 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8mzf\" (UniqueName: \"kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.161336 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.221826 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.222952 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.227748 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.247266 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.257841 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.257927 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258053 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258167 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmb8j\" (UniqueName: \"kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258193 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258246 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8mzf\" (UniqueName: \"kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258263 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.258362 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.261205 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.262439 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.269660 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.272855 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.282790 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.283064 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.295306 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8mzf\" (UniqueName: \"kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf\") pod \"nova-cell0-cell-mapping-lmd99\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.301402 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.358597 4822 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.360097 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361425 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmb8j\" (UniqueName: \"kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgjq8\" (UniqueName: \"kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361494 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361616 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361647 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lj7\" (UniqueName: \"kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361674 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361715 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361742 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.361775 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " 
pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.362654 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.362848 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.363678 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.378277 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.382837 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.385876 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmb8j\" (UniqueName: \"kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j\") pod \"nova-api-0\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.403526 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.456603 4822 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.458468 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.464847 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lj7\" (UniqueName: \"kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.464932 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.464980 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465021 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465041 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24gg2\" (UniqueName: 
\"kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465066 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465095 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465163 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgjq8\" (UniqueName: \"kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data\") pod \"nova-metadata-0\" (UID: 
\"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.465512 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.469949 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.479309 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.480521 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.482550 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.486071 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.494941 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgjq8\" (UniqueName: \"kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8\") pod \"nova-scheduler-0\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.495179 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lj7\" (UniqueName: \"kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.539476 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.567300 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.567383 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phlql\" (UniqueName: \"kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.567928 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc 
kubenswrapper[4822]: I1010 07:58:04.568168 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.568222 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.568283 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.568394 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24gg2\" (UniqueName: \"kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.568482 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.568529 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.569291 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.573698 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.575140 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.587930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24gg2\" (UniqueName: \"kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2\") pod \"nova-metadata-0\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.670908 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phlql\" (UniqueName: \"kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" 
(UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.671047 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.671083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.671112 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.671140 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.672037 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 
07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.673189 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.673695 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.674006 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.707498 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phlql\" (UniqueName: \"kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql\") pod \"dnsmasq-dns-7bc866d9fc-fdpnr\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.766372 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.824319 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.837318 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:04 crc kubenswrapper[4822]: I1010 07:58:04.938390 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lmd99"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.068685 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lmd99" event={"ID":"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a","Type":"ContainerStarted","Data":"a40a140807089d00333fbda778bb0d0fecf5febcabdd1a3af73f8274b2459424"} Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.178917 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.246735 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.338332 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:05 crc kubenswrapper[4822]: W1010 07:58:05.338374 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda28eed5e_47a9_4935_a7c7_876994710f01.slice/crio-1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02 WatchSource:0}: Error finding container 1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02: Status 404 returned error can't find the container with id 1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02 Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.440610 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.457772 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wt8m4"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.459205 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.462160 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.462571 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.469774 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.480896 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wt8m4"] Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.615060 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.615430 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.615553 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9mj\" (UniqueName: \"kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " 
pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.615638 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.716925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.717054 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt9mj\" (UniqueName: \"kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.717084 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.717171 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: 
\"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.723408 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.724680 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.724843 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.737120 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt9mj\" (UniqueName: \"kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj\") pod \"nova-cell1-conductor-db-sync-wt8m4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:05 crc kubenswrapper[4822]: I1010 07:58:05.836100 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.103338 4822 generic.go:334] "Generic (PLEG): container finished" podID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerID="18a1a9e7dcf699613f1826b3d76e1d7fb9dece54a10d10986413860ed8077493" exitCode=0 Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.103947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" event={"ID":"427370c4-f63e-43d6-a48b-e5b64abd66be","Type":"ContainerDied","Data":"18a1a9e7dcf699613f1826b3d76e1d7fb9dece54a10d10986413860ed8077493"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.104082 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" event={"ID":"427370c4-f63e-43d6-a48b-e5b64abd66be","Type":"ContainerStarted","Data":"8cae83955167e90417dc631d24ac90edae520e8061bd4229236a988122d51eb8"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.121130 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b82e92e-a46c-4015-9455-ee5319632827","Type":"ContainerStarted","Data":"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.121194 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b82e92e-a46c-4015-9455-ee5319632827","Type":"ContainerStarted","Data":"1db359ffeead417531fa60628c2e4f7a2d39dc010b3f83830cd163ee6fe101b3"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.137893 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerStarted","Data":"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.137957 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerStarted","Data":"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.137979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerStarted","Data":"b52b844ae49b69a5a0d4fdb3cf96b5a5fb9336d05ac44ab4a769413cc4c97b6b"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.149325 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a28eed5e-47a9-4935-a7c7-876994710f01","Type":"ContainerStarted","Data":"1691843ee79b16d9ed91cbb09ef72834b05b15c06ffc039415197ae009fb180d"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.150655 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a28eed5e-47a9-4935-a7c7-876994710f01","Type":"ContainerStarted","Data":"1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.162794 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.162766296 podStartE2EDuration="2.162766296s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:06.151233423 +0000 UTC m=+5633.246391649" watchObservedRunningTime="2025-10-10 07:58:06.162766296 +0000 UTC m=+5633.257924492" Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.166549 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lmd99" event={"ID":"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a","Type":"ContainerStarted","Data":"008e14d1ecfa29f5cd6fbec1ead9cc560e14dde0a74db23761b2e030d4a9f872"} Oct 10 07:58:06 crc 
kubenswrapper[4822]: I1010 07:58:06.177670 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerStarted","Data":"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.177724 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerStarted","Data":"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.177733 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerStarted","Data":"9c65566aac4d812eb7a5fef9e4d00d52f7a0c03146fe92e553ee94800333f606"} Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.187925 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.18789399 podStartE2EDuration="2.18789399s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:06.167204784 +0000 UTC m=+5633.262363000" watchObservedRunningTime="2025-10-10 07:58:06.18789399 +0000 UTC m=+5633.283052176" Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.197247 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.197227609 podStartE2EDuration="2.197227609s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:06.185530822 +0000 UTC m=+5633.280689018" watchObservedRunningTime="2025-10-10 07:58:06.197227609 +0000 UTC m=+5633.292385805" Oct 10 
07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.215791 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.215752363 podStartE2EDuration="2.215752363s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:06.207223037 +0000 UTC m=+5633.302381243" watchObservedRunningTime="2025-10-10 07:58:06.215752363 +0000 UTC m=+5633.310910569" Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.241904 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lmd99" podStartSLOduration=2.241875017 podStartE2EDuration="2.241875017s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:06.223685242 +0000 UTC m=+5633.318843438" watchObservedRunningTime="2025-10-10 07:58:06.241875017 +0000 UTC m=+5633.337033213" Oct 10 07:58:06 crc kubenswrapper[4822]: I1010 07:58:06.314007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wt8m4"] Oct 10 07:58:06 crc kubenswrapper[4822]: W1010 07:58:06.319979 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10209d33_8de9_4152_a66e_e34b045618b4.slice/crio-839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c WatchSource:0}: Error finding container 839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c: Status 404 returned error can't find the container with id 839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.186742 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" 
event={"ID":"427370c4-f63e-43d6-a48b-e5b64abd66be","Type":"ContainerStarted","Data":"113da4b3073efe7f9ca2fe6d65dd40b1f912a8bdaf36a9dd554f0c85ad823093"} Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.187439 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.191201 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" event={"ID":"10209d33-8de9-4152-a66e-e34b045618b4","Type":"ContainerStarted","Data":"a9af087ffe6d175ba49fb19b7a4029cf02b0fc8d7c57e0dbf3aa864c0e4befe9"} Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.191246 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" event={"ID":"10209d33-8de9-4152-a66e-e34b045618b4","Type":"ContainerStarted","Data":"839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c"} Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.235209 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" podStartSLOduration=3.235189036 podStartE2EDuration="3.235189036s" podCreationTimestamp="2025-10-10 07:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:07.213187022 +0000 UTC m=+5634.308345228" watchObservedRunningTime="2025-10-10 07:58:07.235189036 +0000 UTC m=+5634.330347232" Oct 10 07:58:07 crc kubenswrapper[4822]: I1010 07:58:07.242856 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" podStartSLOduration=2.242830956 podStartE2EDuration="2.242830956s" podCreationTimestamp="2025-10-10 07:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 
07:58:07.233504257 +0000 UTC m=+5634.328662463" watchObservedRunningTime="2025-10-10 07:58:07.242830956 +0000 UTC m=+5634.337989162" Oct 10 07:58:09 crc kubenswrapper[4822]: I1010 07:58:09.539897 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:09 crc kubenswrapper[4822]: I1010 07:58:09.767088 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:58:09 crc kubenswrapper[4822]: I1010 07:58:09.825233 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:09 crc kubenswrapper[4822]: I1010 07:58:09.825304 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:10 crc kubenswrapper[4822]: I1010 07:58:10.219363 4822 generic.go:334] "Generic (PLEG): container finished" podID="c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" containerID="008e14d1ecfa29f5cd6fbec1ead9cc560e14dde0a74db23761b2e030d4a9f872" exitCode=0 Oct 10 07:58:10 crc kubenswrapper[4822]: I1010 07:58:10.219436 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lmd99" event={"ID":"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a","Type":"ContainerDied","Data":"008e14d1ecfa29f5cd6fbec1ead9cc560e14dde0a74db23761b2e030d4a9f872"} Oct 10 07:58:10 crc kubenswrapper[4822]: I1010 07:58:10.222838 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" event={"ID":"10209d33-8de9-4152-a66e-e34b045618b4","Type":"ContainerDied","Data":"a9af087ffe6d175ba49fb19b7a4029cf02b0fc8d7c57e0dbf3aa864c0e4befe9"} Oct 10 07:58:10 crc kubenswrapper[4822]: I1010 07:58:10.222971 4822 generic.go:334] "Generic (PLEG): container finished" podID="10209d33-8de9-4152-a66e-e34b045618b4" containerID="a9af087ffe6d175ba49fb19b7a4029cf02b0fc8d7c57e0dbf3aa864c0e4befe9" exitCode=0 Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 
07:58:11.677862 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.684566 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.868674 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8mzf\" (UniqueName: \"kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf\") pod \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869041 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle\") pod \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869105 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt9mj\" (UniqueName: \"kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj\") pod \"10209d33-8de9-4152-a66e-e34b045618b4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869222 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data\") pod \"10209d33-8de9-4152-a66e-e34b045618b4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869301 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle\") pod \"10209d33-8de9-4152-a66e-e34b045618b4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869416 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts\") pod \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869518 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") pod \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.869596 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts\") pod \"10209d33-8de9-4152-a66e-e34b045618b4\" (UID: \"10209d33-8de9-4152-a66e-e34b045618b4\") " Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.877142 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf" (OuterVolumeSpecName: "kube-api-access-r8mzf") pod "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" (UID: "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a"). InnerVolumeSpecName "kube-api-access-r8mzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.882962 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts" (OuterVolumeSpecName: "scripts") pod "10209d33-8de9-4152-a66e-e34b045618b4" (UID: "10209d33-8de9-4152-a66e-e34b045618b4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.883256 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj" (OuterVolumeSpecName: "kube-api-access-mt9mj") pod "10209d33-8de9-4152-a66e-e34b045618b4" (UID: "10209d33-8de9-4152-a66e-e34b045618b4"). InnerVolumeSpecName "kube-api-access-mt9mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.885999 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts" (OuterVolumeSpecName: "scripts") pod "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" (UID: "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.901563 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data" (OuterVolumeSpecName: "config-data") pod "10209d33-8de9-4152-a66e-e34b045618b4" (UID: "10209d33-8de9-4152-a66e-e34b045618b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.901743 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10209d33-8de9-4152-a66e-e34b045618b4" (UID: "10209d33-8de9-4152-a66e-e34b045618b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: E1010 07:58:11.903642 4822 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data podName:c1dfed5c-0c78-4bfb-b4e5-bf19d986619a nodeName:}" failed. No retries permitted until 2025-10-10 07:58:12.403599304 +0000 UTC m=+5639.498757510 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data") pod "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" (UID: "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a") : error deleting /var/lib/kubelet/pods/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a/volume-subpaths: remove /var/lib/kubelet/pods/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a/volume-subpaths: no such file or directory Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.908599 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" (UID: "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973120 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973488 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973562 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973617 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10209d33-8de9-4152-a66e-e34b045618b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973669 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8mzf\" (UniqueName: \"kubernetes.io/projected/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-kube-api-access-r8mzf\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973721 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:11 crc kubenswrapper[4822]: I1010 07:58:11.973843 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt9mj\" (UniqueName: \"kubernetes.io/projected/10209d33-8de9-4152-a66e-e34b045618b4-kube-api-access-mt9mj\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.271176 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lmd99" event={"ID":"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a","Type":"ContainerDied","Data":"a40a140807089d00333fbda778bb0d0fecf5febcabdd1a3af73f8274b2459424"} Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.271258 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40a140807089d00333fbda778bb0d0fecf5febcabdd1a3af73f8274b2459424" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.271371 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lmd99" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.279290 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" event={"ID":"10209d33-8de9-4152-a66e-e34b045618b4","Type":"ContainerDied","Data":"839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c"} Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.279353 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839ab7fd5d8cbc5611a138cf7b96fba243ded7d9b2c735c962779237c505045c" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.279433 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wt8m4" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.339888 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:58:12 crc kubenswrapper[4822]: E1010 07:58:12.340364 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10209d33-8de9-4152-a66e-e34b045618b4" containerName="nova-cell1-conductor-db-sync" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.340395 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="10209d33-8de9-4152-a66e-e34b045618b4" containerName="nova-cell1-conductor-db-sync" Oct 10 07:58:12 crc kubenswrapper[4822]: E1010 07:58:12.340434 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" containerName="nova-manage" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.340449 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" containerName="nova-manage" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.340768 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" containerName="nova-manage" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.340828 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="10209d33-8de9-4152-a66e-e34b045618b4" containerName="nova-cell1-conductor-db-sync" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.341634 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.344135 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.355284 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.384613 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.384745 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.384797 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcwf\" (UniqueName: \"kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.486133 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") pod \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\" (UID: \"c1dfed5c-0c78-4bfb-b4e5-bf19d986619a\") " Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 
07:58:12.486761 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.487334 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.487385 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcwf\" (UniqueName: \"kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.491403 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data" (OuterVolumeSpecName: "config-data") pod "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" (UID: "c1dfed5c-0c78-4bfb-b4e5-bf19d986619a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.491440 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.497068 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.509151 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.509387 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-log" containerID="cri-o://e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" gracePeriod=30 Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.509458 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-api" containerID="cri-o://5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" gracePeriod=30 Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.512757 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcwf\" (UniqueName: \"kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf\") pod \"nova-cell1-conductor-0\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " pod="openstack/nova-cell1-conductor-0" 
Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.539026 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.539294 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a28eed5e-47a9-4935-a7c7-876994710f01" containerName="nova-scheduler-scheduler" containerID="cri-o://1691843ee79b16d9ed91cbb09ef72834b05b15c06ffc039415197ae009fb180d" gracePeriod=30 Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.553280 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.553588 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-log" containerID="cri-o://c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" gracePeriod=30 Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.554831 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-metadata" containerID="cri-o://3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" gracePeriod=30 Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.589012 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:12 crc kubenswrapper[4822]: I1010 07:58:12.709330 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.177173 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.185024 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: W1010 07:58:13.283558 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53e314d1_6335_4c39_a8a2_d164cdd11d9a.slice/crio-89253db5c204baf9502c2e71b827139bc2668d5be7e9955cc9208ab05363290a WatchSource:0}: Error finding container 89253db5c204baf9502c2e71b827139bc2668d5be7e9955cc9208ab05363290a: Status 404 returned error can't find the container with id 89253db5c204baf9502c2e71b827139bc2668d5be7e9955cc9208ab05363290a Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.285749 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291033 4822 generic.go:334] "Generic (PLEG): container finished" podID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerID="3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" exitCode=0 Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291062 4822 generic.go:334] "Generic (PLEG): container finished" podID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerID="c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" exitCode=143 Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291094 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerDied","Data":"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291120 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerDied","Data":"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"934a03ec-4884-4c40-a555-d1a988c9f60a","Type":"ContainerDied","Data":"9c65566aac4d812eb7a5fef9e4d00d52f7a0c03146fe92e553ee94800333f606"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291143 4822 scope.go:117] "RemoveContainer" containerID="3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.291236 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295187 4822 generic.go:334] "Generic (PLEG): container finished" podID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerID="5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" exitCode=0 Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295207 4822 generic.go:334] "Generic (PLEG): container finished" podID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerID="e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" exitCode=143 Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295225 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerDied","Data":"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295246 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerDied","Data":"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295259 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"003ec4c6-ed7a-4965-8643-4bb9bae5a896","Type":"ContainerDied","Data":"b52b844ae49b69a5a0d4fdb3cf96b5a5fb9336d05ac44ab4a769413cc4c97b6b"} Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.295262 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301474 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle\") pod \"934a03ec-4884-4c40-a555-d1a988c9f60a\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301534 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs\") pod \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301630 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle\") pod \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301700 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24gg2\" (UniqueName: \"kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2\") pod \"934a03ec-4884-4c40-a555-d1a988c9f60a\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301789 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs\") pod \"934a03ec-4884-4c40-a555-d1a988c9f60a\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301865 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmb8j\" (UniqueName: \"kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j\") pod \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301960 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data\") pod \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\" (UID: \"003ec4c6-ed7a-4965-8643-4bb9bae5a896\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.301999 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data\") pod \"934a03ec-4884-4c40-a555-d1a988c9f60a\" (UID: \"934a03ec-4884-4c40-a555-d1a988c9f60a\") " Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.302570 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs" (OuterVolumeSpecName: "logs") pod "934a03ec-4884-4c40-a555-d1a988c9f60a" (UID: "934a03ec-4884-4c40-a555-d1a988c9f60a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.303164 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs" (OuterVolumeSpecName: "logs") pod "003ec4c6-ed7a-4965-8643-4bb9bae5a896" (UID: "003ec4c6-ed7a-4965-8643-4bb9bae5a896"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.305271 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j" (OuterVolumeSpecName: "kube-api-access-fmb8j") pod "003ec4c6-ed7a-4965-8643-4bb9bae5a896" (UID: "003ec4c6-ed7a-4965-8643-4bb9bae5a896"). InnerVolumeSpecName "kube-api-access-fmb8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.305280 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2" (OuterVolumeSpecName: "kube-api-access-24gg2") pod "934a03ec-4884-4c40-a555-d1a988c9f60a" (UID: "934a03ec-4884-4c40-a555-d1a988c9f60a"). InnerVolumeSpecName "kube-api-access-24gg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.314618 4822 scope.go:117] "RemoveContainer" containerID="c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.325040 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data" (OuterVolumeSpecName: "config-data") pod "003ec4c6-ed7a-4965-8643-4bb9bae5a896" (UID: "003ec4c6-ed7a-4965-8643-4bb9bae5a896"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.325230 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "934a03ec-4884-4c40-a555-d1a988c9f60a" (UID: "934a03ec-4884-4c40-a555-d1a988c9f60a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.336736 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data" (OuterVolumeSpecName: "config-data") pod "934a03ec-4884-4c40-a555-d1a988c9f60a" (UID: "934a03ec-4884-4c40-a555-d1a988c9f60a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.345616 4822 scope.go:117] "RemoveContainer" containerID="3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.345985 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b\": container with ID starting with 3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b not found: ID does not exist" containerID="3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346018 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b"} err="failed to get container status \"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b\": rpc error: code = NotFound desc = could not find container \"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b\": container with ID starting with 3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346041 4822 scope.go:117] "RemoveContainer" containerID="c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.346240 4822 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d\": container with ID starting with c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d not found: ID does not exist" containerID="c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346258 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d"} err="failed to get container status \"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d\": rpc error: code = NotFound desc = could not find container \"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d\": container with ID starting with c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346272 4822 scope.go:117] "RemoveContainer" containerID="3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346710 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b"} err="failed to get container status \"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b\": rpc error: code = NotFound desc = could not find container \"3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b\": container with ID starting with 3f391e90bccac476f4d796de42b4a8a094d8010ebfe86c324c9593d5840b190b not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346733 4822 scope.go:117] "RemoveContainer" containerID="c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346973 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d"} err="failed to get container status \"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d\": rpc error: code = NotFound desc = could not find container \"c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d\": container with ID starting with c4fc30aad1ec5c52385f7ed2195c1e9b04ecd4f26ece00173d8ccb24cff4202d not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.346996 4822 scope.go:117] "RemoveContainer" containerID="5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.361947 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "003ec4c6-ed7a-4965-8643-4bb9bae5a896" (UID: "003ec4c6-ed7a-4965-8643-4bb9bae5a896"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.377020 4822 scope.go:117] "RemoveContainer" containerID="e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.394055 4822 scope.go:117] "RemoveContainer" containerID="5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.394418 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e\": container with ID starting with 5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e not found: ID does not exist" containerID="5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.394463 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e"} err="failed to get container status \"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e\": rpc error: code = NotFound desc = could not find container \"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e\": container with ID starting with 5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.394491 4822 scope.go:117] "RemoveContainer" containerID="e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.394722 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5\": container with ID starting with 
e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5 not found: ID does not exist" containerID="e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.394753 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5"} err="failed to get container status \"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5\": rpc error: code = NotFound desc = could not find container \"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5\": container with ID starting with e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5 not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.394772 4822 scope.go:117] "RemoveContainer" containerID="5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.395303 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e"} err="failed to get container status \"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e\": rpc error: code = NotFound desc = could not find container \"5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e\": container with ID starting with 5a64e3aa91e83fb371e9d2788b813f96f2dae87480700f1004afcb4613cbdc8e not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.395351 4822 scope.go:117] "RemoveContainer" containerID="e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.395641 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5"} err="failed to get container status 
\"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5\": rpc error: code = NotFound desc = could not find container \"e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5\": container with ID starting with e953399597b9cf26a94d80d6c6f6ae0681fe70e654c4896829952c41924fced5 not found: ID does not exist" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403727 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003ec4c6-ed7a-4965-8643-4bb9bae5a896-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403759 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403770 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24gg2\" (UniqueName: \"kubernetes.io/projected/934a03ec-4884-4c40-a555-d1a988c9f60a-kube-api-access-24gg2\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403779 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934a03ec-4884-4c40-a555-d1a988c9f60a-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403787 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmb8j\" (UniqueName: \"kubernetes.io/projected/003ec4c6-ed7a-4965-8643-4bb9bae5a896-kube-api-access-fmb8j\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403807 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003ec4c6-ed7a-4965-8643-4bb9bae5a896-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403816 4822 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.403826 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934a03ec-4884-4c40-a555-d1a988c9f60a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.687137 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.714127 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.724641 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.739862 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.752422 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.752916 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-api" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.752959 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-api" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.752977 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-log" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.752985 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-log" Oct 10 
07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.753008 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-metadata" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753016 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-metadata" Oct 10 07:58:13 crc kubenswrapper[4822]: E1010 07:58:13.753037 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-log" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753045 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-log" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753261 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-api" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753290 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" containerName="nova-api-log" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753313 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-metadata" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.753331 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" containerName="nova-metadata-log" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.754512 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.757599 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.763314 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.778396 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.779943 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.782026 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.785770 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.923624 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.923860 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.923919 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs\") pod 
\"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.923983 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75h2z\" (UniqueName: \"kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.924087 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.924182 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.924283 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmw7\" (UniqueName: \"kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:13 crc kubenswrapper[4822]: I1010 07:58:13.924363 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " 
pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.026980 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027069 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027104 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027141 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75h2z\" (UniqueName: \"kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027192 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027245 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmw7\" (UniqueName: \"kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.027319 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.028366 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.029615 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.033406 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 
07:58:14.033689 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.034516 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.036499 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.044855 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmw7\" (UniqueName: \"kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7\") pod \"nova-metadata-0\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.059254 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75h2z\" (UniqueName: \"kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z\") pod \"nova-api-0\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.076693 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.101563 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.307046 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"53e314d1-6335-4c39-a8a2-d164cdd11d9a","Type":"ContainerStarted","Data":"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd"} Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.307507 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"53e314d1-6335-4c39-a8a2-d164cdd11d9a","Type":"ContainerStarted","Data":"89253db5c204baf9502c2e71b827139bc2668d5be7e9955cc9208ab05363290a"} Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.307956 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.327424 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.327386637 podStartE2EDuration="2.327386637s" podCreationTimestamp="2025-10-10 07:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:14.321797415 +0000 UTC m=+5641.416955621" watchObservedRunningTime="2025-10-10 07:58:14.327386637 +0000 UTC m=+5641.422544823" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.540161 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.540513 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.551998 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.685548 4822 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.838791 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.907093 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:58:14 crc kubenswrapper[4822]: I1010 07:58:14.907301 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="dnsmasq-dns" containerID="cri-o://b5d18e77a7b45c64932a5b7b623579aea712bab93225be2fa3fcc6d917a65934" gracePeriod=10 Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.328410 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerStarted","Data":"5a8591629271912f5e22698b1ab1055c17e8d885204e7cdedac05639d64137d5"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.328734 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerStarted","Data":"02f29dbbdd4d20d1cd93a87bdbbfa002f28a297549e5649907c5d8794668905c"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.328753 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerStarted","Data":"7d44c897f98b98756d1c33fceea154ba2fa81a3c2a77dd3a7919eb6619a602c6"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.341672 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.341729 4822 generic.go:334] "Generic (PLEG): container finished" podID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerID="b5d18e77a7b45c64932a5b7b623579aea712bab93225be2fa3fcc6d917a65934" exitCode=0 Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.341787 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" event={"ID":"18e91592-f700-4c18-b07f-0abe2e262fd9","Type":"ContainerDied","Data":"b5d18e77a7b45c64932a5b7b623579aea712bab93225be2fa3fcc6d917a65934"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.341832 4822 scope.go:117] "RemoveContainer" containerID="b5d18e77a7b45c64932a5b7b623579aea712bab93225be2fa3fcc6d917a65934" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.348067 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.348049894 podStartE2EDuration="2.348049894s" podCreationTimestamp="2025-10-10 07:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:15.346272923 +0000 UTC m=+5642.441431119" watchObservedRunningTime="2025-10-10 07:58:15.348049894 +0000 UTC m=+5642.443208090" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.355782 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerStarted","Data":"8e9444ccffb08cf7b7626adaf5323aa218664f9a98b08a30e4aa89463381dc49"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.355846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerStarted","Data":"ab50881e4c0a95433c8cd8c820df7f8480d19abd4f7e0313993a4ae45eb3af72"} Oct 10 07:58:15 crc 
kubenswrapper[4822]: I1010 07:58:15.355860 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerStarted","Data":"76029e8ab522512c1ff20e2ff7c23c6d2d705313205660990392b5ba5f136cef"} Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.365968 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.420639 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.420614366 podStartE2EDuration="2.420614366s" podCreationTimestamp="2025-10-10 07:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:15.391603449 +0000 UTC m=+5642.486761655" watchObservedRunningTime="2025-10-10 07:58:15.420614366 +0000 UTC m=+5642.515772572" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.427987 4822 scope.go:117] "RemoveContainer" containerID="8146239dbcce20485e7eeb7d6ff104665aaf9db1e6934d244427e97b34b9e83c" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.458473 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb\") pod \"18e91592-f700-4c18-b07f-0abe2e262fd9\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.458936 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrtft\" (UniqueName: \"kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft\") pod \"18e91592-f700-4c18-b07f-0abe2e262fd9\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.458990 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc\") pod \"18e91592-f700-4c18-b07f-0abe2e262fd9\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.459052 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config\") pod \"18e91592-f700-4c18-b07f-0abe2e262fd9\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.459196 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb\") pod \"18e91592-f700-4c18-b07f-0abe2e262fd9\" (UID: \"18e91592-f700-4c18-b07f-0abe2e262fd9\") " Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.473883 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft" (OuterVolumeSpecName: "kube-api-access-qrtft") pod "18e91592-f700-4c18-b07f-0abe2e262fd9" (UID: "18e91592-f700-4c18-b07f-0abe2e262fd9"). InnerVolumeSpecName "kube-api-access-qrtft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.540124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18e91592-f700-4c18-b07f-0abe2e262fd9" (UID: "18e91592-f700-4c18-b07f-0abe2e262fd9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.560382 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18e91592-f700-4c18-b07f-0abe2e262fd9" (UID: "18e91592-f700-4c18-b07f-0abe2e262fd9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.560913 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config" (OuterVolumeSpecName: "config") pod "18e91592-f700-4c18-b07f-0abe2e262fd9" (UID: "18e91592-f700-4c18-b07f-0abe2e262fd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.561250 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.563672 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrtft\" (UniqueName: \"kubernetes.io/projected/18e91592-f700-4c18-b07f-0abe2e262fd9-kube-api-access-qrtft\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.563713 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.563723 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 
07:58:15.564180 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18e91592-f700-4c18-b07f-0abe2e262fd9" (UID: "18e91592-f700-4c18-b07f-0abe2e262fd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.663519 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003ec4c6-ed7a-4965-8643-4bb9bae5a896" path="/var/lib/kubelet/pods/003ec4c6-ed7a-4965-8643-4bb9bae5a896/volumes" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.664305 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934a03ec-4884-4c40-a555-d1a988c9f60a" path="/var/lib/kubelet/pods/934a03ec-4884-4c40-a555-d1a988c9f60a/volumes" Oct 10 07:58:15 crc kubenswrapper[4822]: I1010 07:58:15.665181 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18e91592-f700-4c18-b07f-0abe2e262fd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:16 crc kubenswrapper[4822]: I1010 07:58:16.362625 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" event={"ID":"18e91592-f700-4c18-b07f-0abe2e262fd9","Type":"ContainerDied","Data":"1742a6f5c62da5729998476e833b5dd09aa0d32ef265731447bf969685ba5fac"} Oct 10 07:58:16 crc kubenswrapper[4822]: I1010 07:58:16.362663 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7869c9d85c-xv8n5" Oct 10 07:58:16 crc kubenswrapper[4822]: I1010 07:58:16.383734 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:58:16 crc kubenswrapper[4822]: I1010 07:58:16.391571 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7869c9d85c-xv8n5"] Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.372264 4822 generic.go:334] "Generic (PLEG): container finished" podID="a28eed5e-47a9-4935-a7c7-876994710f01" containerID="1691843ee79b16d9ed91cbb09ef72834b05b15c06ffc039415197ae009fb180d" exitCode=0 Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.372305 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a28eed5e-47a9-4935-a7c7-876994710f01","Type":"ContainerDied","Data":"1691843ee79b16d9ed91cbb09ef72834b05b15c06ffc039415197ae009fb180d"} Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.372332 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a28eed5e-47a9-4935-a7c7-876994710f01","Type":"ContainerDied","Data":"1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02"} Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.372343 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1997172b07d52d605cdc3d8d23d2e21615c3d7204958c7e714860aaef7ecaa02" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.397837 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.501471 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgjq8\" (UniqueName: \"kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8\") pod \"a28eed5e-47a9-4935-a7c7-876994710f01\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.501616 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data\") pod \"a28eed5e-47a9-4935-a7c7-876994710f01\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.501694 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle\") pod \"a28eed5e-47a9-4935-a7c7-876994710f01\" (UID: \"a28eed5e-47a9-4935-a7c7-876994710f01\") " Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.507687 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8" (OuterVolumeSpecName: "kube-api-access-vgjq8") pod "a28eed5e-47a9-4935-a7c7-876994710f01" (UID: "a28eed5e-47a9-4935-a7c7-876994710f01"). InnerVolumeSpecName "kube-api-access-vgjq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.530387 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data" (OuterVolumeSpecName: "config-data") pod "a28eed5e-47a9-4935-a7c7-876994710f01" (UID: "a28eed5e-47a9-4935-a7c7-876994710f01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.532421 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a28eed5e-47a9-4935-a7c7-876994710f01" (UID: "a28eed5e-47a9-4935-a7c7-876994710f01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.603720 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.603760 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28eed5e-47a9-4935-a7c7-876994710f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.603775 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgjq8\" (UniqueName: \"kubernetes.io/projected/a28eed5e-47a9-4935-a7c7-876994710f01-kube-api-access-vgjq8\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:17 crc kubenswrapper[4822]: I1010 07:58:17.677650 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" path="/var/lib/kubelet/pods/18e91592-f700-4c18-b07f-0abe2e262fd9/volumes" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.399723 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.431984 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.457753 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.469948 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:18 crc kubenswrapper[4822]: E1010 07:58:18.470368 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="init" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.470384 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="init" Oct 10 07:58:18 crc kubenswrapper[4822]: E1010 07:58:18.470428 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="dnsmasq-dns" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.470436 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="dnsmasq-dns" Oct 10 07:58:18 crc kubenswrapper[4822]: E1010 07:58:18.470444 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28eed5e-47a9-4935-a7c7-876994710f01" containerName="nova-scheduler-scheduler" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.470451 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28eed5e-47a9-4935-a7c7-876994710f01" containerName="nova-scheduler-scheduler" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.470619 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28eed5e-47a9-4935-a7c7-876994710f01" containerName="nova-scheduler-scheduler" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.470639 4822 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="18e91592-f700-4c18-b07f-0abe2e262fd9" containerName="dnsmasq-dns" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.471371 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.473168 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.478721 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.622130 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tppzt\" (UniqueName: \"kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.622622 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.622894 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.725152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tppzt\" (UniqueName: 
\"kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.725497 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.725597 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.745593 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.745862 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.749231 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tppzt\" (UniqueName: \"kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt\") pod \"nova-scheduler-0\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " 
pod="openstack/nova-scheduler-0" Oct 10 07:58:18 crc kubenswrapper[4822]: I1010 07:58:18.794736 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:19 crc kubenswrapper[4822]: I1010 07:58:19.077438 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:19 crc kubenswrapper[4822]: I1010 07:58:19.077892 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:19 crc kubenswrapper[4822]: I1010 07:58:19.340553 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:19 crc kubenswrapper[4822]: W1010 07:58:19.340694 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd13af0f7_8d2e_4168_b815_0e6f9a2f2f55.slice/crio-8b9a84797a2d36af6a2d9c9c52f5e7b7cbd7162f41f725021c4266add4ebb0db WatchSource:0}: Error finding container 8b9a84797a2d36af6a2d9c9c52f5e7b7cbd7162f41f725021c4266add4ebb0db: Status 404 returned error can't find the container with id 8b9a84797a2d36af6a2d9c9c52f5e7b7cbd7162f41f725021c4266add4ebb0db Oct 10 07:58:19 crc kubenswrapper[4822]: I1010 07:58:19.410822 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55","Type":"ContainerStarted","Data":"8b9a84797a2d36af6a2d9c9c52f5e7b7cbd7162f41f725021c4266add4ebb0db"} Oct 10 07:58:19 crc kubenswrapper[4822]: I1010 07:58:19.664267 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a28eed5e-47a9-4935-a7c7-876994710f01" path="/var/lib/kubelet/pods/a28eed5e-47a9-4935-a7c7-876994710f01/volumes" Oct 10 07:58:20 crc kubenswrapper[4822]: I1010 07:58:20.422742 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55","Type":"ContainerStarted","Data":"8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8"} Oct 10 07:58:20 crc kubenswrapper[4822]: I1010 07:58:20.444335 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.444316788 podStartE2EDuration="2.444316788s" podCreationTimestamp="2025-10-10 07:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:20.441433025 +0000 UTC m=+5647.536591251" watchObservedRunningTime="2025-10-10 07:58:20.444316788 +0000 UTC m=+5647.539474984" Oct 10 07:58:22 crc kubenswrapper[4822]: I1010 07:58:22.731329 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.222301 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zv84p"] Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.223532 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.227193 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.227503 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.232747 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zv84p"] Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.307375 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.307703 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.307842 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527jf\" (UniqueName: \"kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.307944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.409687 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.409775 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-527jf\" (UniqueName: \"kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.409856 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.409915 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.415780 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.416517 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.425026 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-527jf\" (UniqueName: \"kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.426321 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zv84p\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.554915 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:23 crc kubenswrapper[4822]: I1010 07:58:23.795088 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.012114 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zv84p"] Oct 10 07:58:24 crc kubenswrapper[4822]: W1010 07:58:24.021208 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528e4fef_6ff0_4f90_9ef0_5f50840bef69.slice/crio-5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58 WatchSource:0}: Error finding container 5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58: Status 404 returned error can't find the container with id 5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58 Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.078435 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.078739 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.102594 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.102705 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.466500 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zv84p" event={"ID":"528e4fef-6ff0-4f90-9ef0-5f50840bef69","Type":"ContainerStarted","Data":"6ca8f626cd853496ad9a6f935f13458a46f259efba30be5c170966de087e4dff"} Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.466547 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zv84p" event={"ID":"528e4fef-6ff0-4f90-9ef0-5f50840bef69","Type":"ContainerStarted","Data":"5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58"} Oct 10 07:58:24 crc kubenswrapper[4822]: I1010 07:58:24.488518 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zv84p" podStartSLOduration=1.48849652 podStartE2EDuration="1.48849652s" podCreationTimestamp="2025-10-10 07:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:24.488242902 +0000 UTC m=+5651.583401128" watchObservedRunningTime="2025-10-10 07:58:24.48849652 +0000 UTC m=+5651.583654726" Oct 10 07:58:25 crc kubenswrapper[4822]: I1010 07:58:25.245013 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:25 crc kubenswrapper[4822]: I1010 07:58:25.245528 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:25 crc kubenswrapper[4822]: I1010 07:58:25.246618 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:25 crc kubenswrapper[4822]: I1010 07:58:25.247179 4822 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:28 crc kubenswrapper[4822]: I1010 07:58:28.795973 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 07:58:28 crc kubenswrapper[4822]: I1010 07:58:28.826603 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 07:58:29 crc kubenswrapper[4822]: I1010 07:58:29.511478 4822 generic.go:334] "Generic (PLEG): container finished" podID="528e4fef-6ff0-4f90-9ef0-5f50840bef69" containerID="6ca8f626cd853496ad9a6f935f13458a46f259efba30be5c170966de087e4dff" exitCode=0 Oct 10 07:58:29 crc kubenswrapper[4822]: I1010 07:58:29.511598 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zv84p" event={"ID":"528e4fef-6ff0-4f90-9ef0-5f50840bef69","Type":"ContainerDied","Data":"6ca8f626cd853496ad9a6f935f13458a46f259efba30be5c170966de087e4dff"} Oct 10 07:58:29 crc kubenswrapper[4822]: I1010 07:58:29.544616 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 07:58:30 crc kubenswrapper[4822]: I1010 07:58:30.886082 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.063075 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle\") pod \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.063191 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data\") pod \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.063229 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527jf\" (UniqueName: \"kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf\") pod \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.063267 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts\") pod \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\" (UID: \"528e4fef-6ff0-4f90-9ef0-5f50840bef69\") " Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.068730 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf" (OuterVolumeSpecName: "kube-api-access-527jf") pod "528e4fef-6ff0-4f90-9ef0-5f50840bef69" (UID: "528e4fef-6ff0-4f90-9ef0-5f50840bef69"). InnerVolumeSpecName "kube-api-access-527jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.070189 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts" (OuterVolumeSpecName: "scripts") pod "528e4fef-6ff0-4f90-9ef0-5f50840bef69" (UID: "528e4fef-6ff0-4f90-9ef0-5f50840bef69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.097275 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "528e4fef-6ff0-4f90-9ef0-5f50840bef69" (UID: "528e4fef-6ff0-4f90-9ef0-5f50840bef69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.097959 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data" (OuterVolumeSpecName: "config-data") pod "528e4fef-6ff0-4f90-9ef0-5f50840bef69" (UID: "528e4fef-6ff0-4f90-9ef0-5f50840bef69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.166303 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.166359 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-527jf\" (UniqueName: \"kubernetes.io/projected/528e4fef-6ff0-4f90-9ef0-5f50840bef69-kube-api-access-527jf\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.166381 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.166397 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e4fef-6ff0-4f90-9ef0-5f50840bef69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.336856 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.337003 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.534957 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-zv84p" event={"ID":"528e4fef-6ff0-4f90-9ef0-5f50840bef69","Type":"ContainerDied","Data":"5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58"} Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.535210 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b1e285bcbefc7c75829b4cfaa26d61d2c75ceb521fe5176a2c3a9436b56cb58" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.535056 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zv84p" Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.747138 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.747552 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-log" containerID="cri-o://ab50881e4c0a95433c8cd8c820df7f8480d19abd4f7e0313993a4ae45eb3af72" gracePeriod=30 Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.747630 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-api" containerID="cri-o://8e9444ccffb08cf7b7626adaf5323aa218664f9a98b08a30e4aa89463381dc49" gracePeriod=30 Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.764663 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.764957 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerName="nova-scheduler-scheduler" containerID="cri-o://8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" gracePeriod=30 Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 
07:58:31.795629 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.795927 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-log" containerID="cri-o://02f29dbbdd4d20d1cd93a87bdbbfa002f28a297549e5649907c5d8794668905c" gracePeriod=30 Oct 10 07:58:31 crc kubenswrapper[4822]: I1010 07:58:31.796023 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-metadata" containerID="cri-o://5a8591629271912f5e22698b1ab1055c17e8d885204e7cdedac05639d64137d5" gracePeriod=30 Oct 10 07:58:32 crc kubenswrapper[4822]: I1010 07:58:32.545084 4822 generic.go:334] "Generic (PLEG): container finished" podID="6081be80-9e45-46d3-9605-0c7f834cf385" containerID="02f29dbbdd4d20d1cd93a87bdbbfa002f28a297549e5649907c5d8794668905c" exitCode=143 Oct 10 07:58:32 crc kubenswrapper[4822]: I1010 07:58:32.545208 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerDied","Data":"02f29dbbdd4d20d1cd93a87bdbbfa002f28a297549e5649907c5d8794668905c"} Oct 10 07:58:32 crc kubenswrapper[4822]: I1010 07:58:32.548953 4822 generic.go:334] "Generic (PLEG): container finished" podID="dead44ba-016f-4b64-803a-26ff3680b619" containerID="ab50881e4c0a95433c8cd8c820df7f8480d19abd4f7e0313993a4ae45eb3af72" exitCode=143 Oct 10 07:58:32 crc kubenswrapper[4822]: I1010 07:58:32.548986 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerDied","Data":"ab50881e4c0a95433c8cd8c820df7f8480d19abd4f7e0313993a4ae45eb3af72"} Oct 10 07:58:33 crc kubenswrapper[4822]: E1010 07:58:33.798069 4822 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:58:33 crc kubenswrapper[4822]: E1010 07:58:33.800342 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:58:33 crc kubenswrapper[4822]: E1010 07:58:33.801921 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:58:33 crc kubenswrapper[4822]: E1010 07:58:33.802045 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerName="nova-scheduler-scheduler" Oct 10 07:58:35 crc kubenswrapper[4822]: I1010 07:58:35.587189 4822 generic.go:334] "Generic (PLEG): container finished" podID="6081be80-9e45-46d3-9605-0c7f834cf385" containerID="5a8591629271912f5e22698b1ab1055c17e8d885204e7cdedac05639d64137d5" exitCode=0 Oct 10 07:58:35 crc kubenswrapper[4822]: I1010 07:58:35.587320 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerDied","Data":"5a8591629271912f5e22698b1ab1055c17e8d885204e7cdedac05639d64137d5"} Oct 10 07:58:35 crc kubenswrapper[4822]: I1010 07:58:35.590200 4822 generic.go:334] "Generic (PLEG): container finished" podID="dead44ba-016f-4b64-803a-26ff3680b619" containerID="8e9444ccffb08cf7b7626adaf5323aa218664f9a98b08a30e4aa89463381dc49" exitCode=0 Oct 10 07:58:35 crc kubenswrapper[4822]: I1010 07:58:35.590260 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerDied","Data":"8e9444ccffb08cf7b7626adaf5323aa218664f9a98b08a30e4aa89463381dc49"} Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.389489 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.395395 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.566949 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75h2z\" (UniqueName: \"kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z\") pod \"dead44ba-016f-4b64-803a-26ff3680b619\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567087 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs\") pod \"6081be80-9e45-46d3-9605-0c7f834cf385\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567133 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data\") 
pod \"6081be80-9e45-46d3-9605-0c7f834cf385\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567188 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvmw7\" (UniqueName: \"kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7\") pod \"6081be80-9e45-46d3-9605-0c7f834cf385\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567296 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data\") pod \"dead44ba-016f-4b64-803a-26ff3680b619\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567342 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle\") pod \"6081be80-9e45-46d3-9605-0c7f834cf385\" (UID: \"6081be80-9e45-46d3-9605-0c7f834cf385\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567400 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs\") pod \"dead44ba-016f-4b64-803a-26ff3680b619\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567429 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle\") pod \"dead44ba-016f-4b64-803a-26ff3680b619\" (UID: \"dead44ba-016f-4b64-803a-26ff3680b619\") " Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.567917 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs" (OuterVolumeSpecName: "logs") pod "6081be80-9e45-46d3-9605-0c7f834cf385" (UID: "6081be80-9e45-46d3-9605-0c7f834cf385"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.568183 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6081be80-9e45-46d3-9605-0c7f834cf385-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.568471 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs" (OuterVolumeSpecName: "logs") pod "dead44ba-016f-4b64-803a-26ff3680b619" (UID: "dead44ba-016f-4b64-803a-26ff3680b619"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.577123 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7" (OuterVolumeSpecName: "kube-api-access-lvmw7") pod "6081be80-9e45-46d3-9605-0c7f834cf385" (UID: "6081be80-9e45-46d3-9605-0c7f834cf385"). InnerVolumeSpecName "kube-api-access-lvmw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.579355 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z" (OuterVolumeSpecName: "kube-api-access-75h2z") pod "dead44ba-016f-4b64-803a-26ff3680b619" (UID: "dead44ba-016f-4b64-803a-26ff3680b619"). InnerVolumeSpecName "kube-api-access-75h2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.591076 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data" (OuterVolumeSpecName: "config-data") pod "dead44ba-016f-4b64-803a-26ff3680b619" (UID: "dead44ba-016f-4b64-803a-26ff3680b619"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.600086 4822 generic.go:334] "Generic (PLEG): container finished" podID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerID="8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" exitCode=0 Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.600152 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55","Type":"ContainerDied","Data":"8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8"} Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.602990 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6081be80-9e45-46d3-9605-0c7f834cf385","Type":"ContainerDied","Data":"7d44c897f98b98756d1c33fceea154ba2fa81a3c2a77dd3a7919eb6619a602c6"} Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.603023 4822 scope.go:117] "RemoveContainer" containerID="5a8591629271912f5e22698b1ab1055c17e8d885204e7cdedac05639d64137d5" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.603059 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.603502 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dead44ba-016f-4b64-803a-26ff3680b619" (UID: "dead44ba-016f-4b64-803a-26ff3680b619"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.603688 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6081be80-9e45-46d3-9605-0c7f834cf385" (UID: "6081be80-9e45-46d3-9605-0c7f834cf385"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.605129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dead44ba-016f-4b64-803a-26ff3680b619","Type":"ContainerDied","Data":"76029e8ab522512c1ff20e2ff7c23c6d2d705313205660990392b5ba5f136cef"} Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.605192 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.615770 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data" (OuterVolumeSpecName: "config-data") pod "6081be80-9e45-46d3-9605-0c7f834cf385" (UID: "6081be80-9e45-46d3-9605-0c7f834cf385"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670071 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670116 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvmw7\" (UniqueName: \"kubernetes.io/projected/6081be80-9e45-46d3-9605-0c7f834cf385-kube-api-access-lvmw7\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670131 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670142 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6081be80-9e45-46d3-9605-0c7f834cf385-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670153 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dead44ba-016f-4b64-803a-26ff3680b619-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670164 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dead44ba-016f-4b64-803a-26ff3680b619-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.670177 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75h2z\" (UniqueName: \"kubernetes.io/projected/dead44ba-016f-4b64-803a-26ff3680b619-kube-api-access-75h2z\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.689136 4822 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.702554 4822 scope.go:117] "RemoveContainer" containerID="02f29dbbdd4d20d1cd93a87bdbbfa002f28a297549e5649907c5d8794668905c" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.715291 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.726175 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: E1010 07:58:36.726854 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-metadata" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.726944 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-metadata" Oct 10 07:58:36 crc kubenswrapper[4822]: E1010 07:58:36.727014 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528e4fef-6ff0-4f90-9ef0-5f50840bef69" containerName="nova-manage" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727063 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="528e4fef-6ff0-4f90-9ef0-5f50840bef69" containerName="nova-manage" Oct 10 07:58:36 crc kubenswrapper[4822]: E1010 07:58:36.727137 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-api" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727206 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-api" Oct 10 07:58:36 crc kubenswrapper[4822]: E1010 07:58:36.727281 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-log" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 
07:58:36.727350 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-log" Oct 10 07:58:36 crc kubenswrapper[4822]: E1010 07:58:36.727424 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-log" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727488 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-log" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727731 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-metadata" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727822 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-api" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727882 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="528e4fef-6ff0-4f90-9ef0-5f50840bef69" containerName="nova-manage" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.727946 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" containerName="nova-metadata-log" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.728002 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dead44ba-016f-4b64-803a-26ff3680b619" containerName="nova-api-log" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.728753 4822 scope.go:117] "RemoveContainer" containerID="8e9444ccffb08cf7b7626adaf5323aa218664f9a98b08a30e4aa89463381dc49" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.729438 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.731732 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.736130 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.760820 4822 scope.go:117] "RemoveContainer" containerID="ab50881e4c0a95433c8cd8c820df7f8480d19abd4f7e0313993a4ae45eb3af72" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.889507 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.889640 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89xrs\" (UniqueName: \"kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.889843 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.889930 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " 
pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.968386 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.978863 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.988229 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.989637 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991781 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991835 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89xrs\" (UniqueName: \"kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991860 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991881 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs\") pod 
\"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991907 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991927 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.991977 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.992015 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96m65\" (UniqueName: \"kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.992397 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.996400 4822 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:58:36 crc kubenswrapper[4822]: I1010 07:58:36.998858 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.001724 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.007848 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.019229 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89xrs\" (UniqueName: \"kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs\") pod \"nova-api-0\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " pod="openstack/nova-api-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.056829 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.093868 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96m65\" (UniqueName: \"kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.094006 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.094066 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.094165 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.095159 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.099968 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.101553 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.103689 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.110073 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96m65\" (UniqueName: \"kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65\") pod \"nova-metadata-0\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.297797 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle\") pod \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.298361 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tppzt\" (UniqueName: \"kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt\") pod \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.298426 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data\") pod \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\" (UID: \"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55\") " Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.304684 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt" (OuterVolumeSpecName: "kube-api-access-tppzt") pod "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" (UID: "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55"). InnerVolumeSpecName "kube-api-access-tppzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.317478 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.344266 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data" (OuterVolumeSpecName: "config-data") pod "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" (UID: "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.346917 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" (UID: "d13af0f7-8d2e-4168-b815-0e6f9a2f2f55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.400711 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.400740 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.400753 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tppzt\" (UniqueName: \"kubernetes.io/projected/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55-kube-api-access-tppzt\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.525660 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.614597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerStarted","Data":"1ab5c3409a19bde3714b866f98a3572245fcc14bc8e937f4b1fbf641fc2873bc"} Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.617010 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.617017 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d13af0f7-8d2e-4168-b815-0e6f9a2f2f55","Type":"ContainerDied","Data":"8b9a84797a2d36af6a2d9c9c52f5e7b7cbd7162f41f725021c4266add4ebb0db"} Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.617101 4822 scope.go:117] "RemoveContainer" containerID="8445cc9adad98df9507e48c2f7fb59d3f8bd062418e551f1a9eca6d5e1b22dc8" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.677566 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6081be80-9e45-46d3-9605-0c7f834cf385" path="/var/lib/kubelet/pods/6081be80-9e45-46d3-9605-0c7f834cf385/volumes" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.679478 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dead44ba-016f-4b64-803a-26ff3680b619" path="/var/lib/kubelet/pods/dead44ba-016f-4b64-803a-26ff3680b619/volumes" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.682541 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.694864 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.747585 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: E1010 07:58:37.748340 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerName="nova-scheduler-scheduler" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.748368 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerName="nova-scheduler-scheduler" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.748780 4822 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" containerName="nova-scheduler-scheduler" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.750186 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.752277 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.759592 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.797419 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.944218 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.944501 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:37 crc kubenswrapper[4822]: I1010 07:58:37.944644 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnn9k\" (UniqueName: \"kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.046267 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.046368 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.046413 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnn9k\" (UniqueName: \"kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.051225 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.053643 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.063473 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnn9k\" (UniqueName: 
\"kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k\") pod \"nova-scheduler-0\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.075854 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.512134 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.631599 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerStarted","Data":"4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.631642 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerStarted","Data":"78c86f0f15a74965de4c06ebe75c6115730467e5b9b70b39f0a30d9944285bbf"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.643044 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerStarted","Data":"7b797a260d5e5d69497019659f7dc12bf7a7bfbd5eabd8ee7463750fbac87bd0"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.643103 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerStarted","Data":"11f62a5cb5fcd6484bb29ef93a60cc272a3e044520425a81becb5bfff795404d"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.643118 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerStarted","Data":"cfa5bf038fb3d2df92cd63ba013dad65178837fa031b0b9dc1b2f5c4a2a0d595"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.647119 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84f37f7a-3fb2-4fca-b877-57c6038e176b","Type":"ContainerStarted","Data":"57e8c36a12d1dae5805f3201a8a662d04935f176c3847a071af2b4c410d8b0ef"} Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.653545 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6535289520000003 podStartE2EDuration="2.653528952s" podCreationTimestamp="2025-10-10 07:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:38.649344972 +0000 UTC m=+5665.744503168" watchObservedRunningTime="2025-10-10 07:58:38.653528952 +0000 UTC m=+5665.748687148" Oct 10 07:58:38 crc kubenswrapper[4822]: I1010 07:58:38.671567 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.671547102 podStartE2EDuration="2.671547102s" podCreationTimestamp="2025-10-10 07:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:38.671359046 +0000 UTC m=+5665.766517242" watchObservedRunningTime="2025-10-10 07:58:38.671547102 +0000 UTC m=+5665.766705308" Oct 10 07:58:39 crc kubenswrapper[4822]: I1010 07:58:39.662147 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13af0f7-8d2e-4168-b815-0e6f9a2f2f55" path="/var/lib/kubelet/pods/d13af0f7-8d2e-4168-b815-0e6f9a2f2f55/volumes" Oct 10 07:58:39 crc kubenswrapper[4822]: I1010 07:58:39.663251 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"84f37f7a-3fb2-4fca-b877-57c6038e176b","Type":"ContainerStarted","Data":"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c"} Oct 10 07:58:39 crc kubenswrapper[4822]: I1010 07:58:39.686907 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.686890976 podStartE2EDuration="2.686890976s" podCreationTimestamp="2025-10-10 07:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:58:39.678065692 +0000 UTC m=+5666.773223898" watchObservedRunningTime="2025-10-10 07:58:39.686890976 +0000 UTC m=+5666.782049172" Oct 10 07:58:42 crc kubenswrapper[4822]: I1010 07:58:42.318187 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:42 crc kubenswrapper[4822]: I1010 07:58:42.318520 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:58:43 crc kubenswrapper[4822]: I1010 07:58:43.076275 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:58:47 crc kubenswrapper[4822]: I1010 07:58:47.057848 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:58:47 crc kubenswrapper[4822]: I1010 07:58:47.058894 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:58:47 crc kubenswrapper[4822]: I1010 07:58:47.318969 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:58:47 crc kubenswrapper[4822]: I1010 07:58:47.319021 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.077022 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.099107 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.099108 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.110277 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.402008 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.402032 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:58:48 crc kubenswrapper[4822]: I1010 07:58:48.777776 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.062183 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.063409 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.064544 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.067875 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.321791 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.322330 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.325428 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.836768 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.839873 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 07:58:57 crc kubenswrapper[4822]: I1010 07:58:57.841015 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.079375 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.081246 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.101135 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.263765 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.264403 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.264507 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.264580 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.264902 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8fgxg\" (UniqueName: \"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.367160 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgxg\" (UniqueName: \"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.367297 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.367348 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.367434 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.367517 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.368672 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.369203 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.369735 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.375151 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.393940 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgxg\" (UniqueName: \"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg\") pod 
\"dnsmasq-dns-6cf8bfcd7c-kc8cb\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.403676 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:58:58 crc kubenswrapper[4822]: I1010 07:58:58.886486 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:58:58 crc kubenswrapper[4822]: W1010 07:58:58.888417 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded464cd5_17cb_42e1_811f_ffa7ab6b33f3.slice/crio-222cfb0fdaea8dad275c2a98b5d854775b2209f291739121fc6edc0f65c05334 WatchSource:0}: Error finding container 222cfb0fdaea8dad275c2a98b5d854775b2209f291739121fc6edc0f65c05334: Status 404 returned error can't find the container with id 222cfb0fdaea8dad275c2a98b5d854775b2209f291739121fc6edc0f65c05334 Oct 10 07:58:59 crc kubenswrapper[4822]: E1010 07:58:59.318248 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded464cd5_17cb_42e1_811f_ffa7ab6b33f3.slice/crio-conmon-3d45c80b386d7e2dfccf3e253852025b59c7c8c27e29ccdac0ad814a553aae13.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:58:59 crc kubenswrapper[4822]: I1010 07:58:59.852969 4822 generic.go:334] "Generic (PLEG): container finished" podID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerID="3d45c80b386d7e2dfccf3e253852025b59c7c8c27e29ccdac0ad814a553aae13" exitCode=0 Oct 10 07:58:59 crc kubenswrapper[4822]: I1010 07:58:59.853041 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" 
event={"ID":"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3","Type":"ContainerDied","Data":"3d45c80b386d7e2dfccf3e253852025b59c7c8c27e29ccdac0ad814a553aae13"} Oct 10 07:58:59 crc kubenswrapper[4822]: I1010 07:58:59.853381 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" event={"ID":"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3","Type":"ContainerStarted","Data":"222cfb0fdaea8dad275c2a98b5d854775b2209f291739121fc6edc0f65c05334"} Oct 10 07:59:00 crc kubenswrapper[4822]: I1010 07:59:00.867262 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" event={"ID":"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3","Type":"ContainerStarted","Data":"90fc98cecdbfb86d797ef58147f771673badb718abd68c71aac864d0390b6164"} Oct 10 07:59:00 crc kubenswrapper[4822]: I1010 07:59:00.867891 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:59:00 crc kubenswrapper[4822]: I1010 07:59:00.889885 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" podStartSLOduration=2.8898659049999997 podStartE2EDuration="2.889865905s" podCreationTimestamp="2025-10-10 07:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:00.882669928 +0000 UTC m=+5687.977828144" watchObservedRunningTime="2025-10-10 07:59:00.889865905 +0000 UTC m=+5687.985024101" Oct 10 07:59:01 crc kubenswrapper[4822]: I1010 07:59:01.337258 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:59:01 crc kubenswrapper[4822]: I1010 07:59:01.337335 4822 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.405995 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.482630 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.487435 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="dnsmasq-dns" containerID="cri-o://113da4b3073efe7f9ca2fe6d65dd40b1f912a8bdaf36a9dd554f0c85ad823093" gracePeriod=10 Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.950582 4822 generic.go:334] "Generic (PLEG): container finished" podID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerID="113da4b3073efe7f9ca2fe6d65dd40b1f912a8bdaf36a9dd554f0c85ad823093" exitCode=0 Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.951008 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" event={"ID":"427370c4-f63e-43d6-a48b-e5b64abd66be","Type":"ContainerDied","Data":"113da4b3073efe7f9ca2fe6d65dd40b1f912a8bdaf36a9dd554f0c85ad823093"} Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.951034 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" event={"ID":"427370c4-f63e-43d6-a48b-e5b64abd66be","Type":"ContainerDied","Data":"8cae83955167e90417dc631d24ac90edae520e8061bd4229236a988122d51eb8"} Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.951044 4822 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="8cae83955167e90417dc631d24ac90edae520e8061bd4229236a988122d51eb8" Oct 10 07:59:08 crc kubenswrapper[4822]: I1010 07:59:08.962924 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.089929 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phlql\" (UniqueName: \"kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql\") pod \"427370c4-f63e-43d6-a48b-e5b64abd66be\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.090011 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc\") pod \"427370c4-f63e-43d6-a48b-e5b64abd66be\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.090122 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config\") pod \"427370c4-f63e-43d6-a48b-e5b64abd66be\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.090186 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb\") pod \"427370c4-f63e-43d6-a48b-e5b64abd66be\" (UID: \"427370c4-f63e-43d6-a48b-e5b64abd66be\") " Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.090234 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb\") pod \"427370c4-f63e-43d6-a48b-e5b64abd66be\" (UID: 
\"427370c4-f63e-43d6-a48b-e5b64abd66be\") " Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.095931 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql" (OuterVolumeSpecName: "kube-api-access-phlql") pod "427370c4-f63e-43d6-a48b-e5b64abd66be" (UID: "427370c4-f63e-43d6-a48b-e5b64abd66be"). InnerVolumeSpecName "kube-api-access-phlql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.142028 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "427370c4-f63e-43d6-a48b-e5b64abd66be" (UID: "427370c4-f63e-43d6-a48b-e5b64abd66be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.143034 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "427370c4-f63e-43d6-a48b-e5b64abd66be" (UID: "427370c4-f63e-43d6-a48b-e5b64abd66be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.147688 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "427370c4-f63e-43d6-a48b-e5b64abd66be" (UID: "427370c4-f63e-43d6-a48b-e5b64abd66be"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.151515 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config" (OuterVolumeSpecName: "config") pod "427370c4-f63e-43d6-a48b-e5b64abd66be" (UID: "427370c4-f63e-43d6-a48b-e5b64abd66be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.192213 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phlql\" (UniqueName: \"kubernetes.io/projected/427370c4-f63e-43d6-a48b-e5b64abd66be-kube-api-access-phlql\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.192253 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.192266 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.192280 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.192293 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/427370c4-f63e-43d6-a48b-e5b64abd66be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.960122 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc866d9fc-fdpnr" Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.988891 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:59:09 crc kubenswrapper[4822]: I1010 07:59:09.994892 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bc866d9fc-fdpnr"] Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.659000 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" path="/var/lib/kubelet/pods/427370c4-f63e-43d6-a48b-e5b64abd66be/volumes" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.920760 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jj4c9"] Oct 10 07:59:11 crc kubenswrapper[4822]: E1010 07:59:11.921184 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="init" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.921210 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="init" Oct 10 07:59:11 crc kubenswrapper[4822]: E1010 07:59:11.921235 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="dnsmasq-dns" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.921242 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="dnsmasq-dns" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.921423 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="427370c4-f63e-43d6-a48b-e5b64abd66be" containerName="dnsmasq-dns" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.922066 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.931117 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jj4c9"] Oct 10 07:59:11 crc kubenswrapper[4822]: I1010 07:59:11.968925 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5h2m\" (UniqueName: \"kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m\") pod \"cinder-db-create-jj4c9\" (UID: \"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13\") " pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.070536 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5h2m\" (UniqueName: \"kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m\") pod \"cinder-db-create-jj4c9\" (UID: \"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13\") " pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.088327 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5h2m\" (UniqueName: \"kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m\") pod \"cinder-db-create-jj4c9\" (UID: \"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13\") " pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.243129 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.718501 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jj4c9"] Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.987023 4822 generic.go:334] "Generic (PLEG): container finished" podID="d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" containerID="fcf6e3f5439f7beaceadd9da96a2d5962cb486d6849e01c7cfca584bec68ec36" exitCode=0 Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.987067 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jj4c9" event={"ID":"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13","Type":"ContainerDied","Data":"fcf6e3f5439f7beaceadd9da96a2d5962cb486d6849e01c7cfca584bec68ec36"} Oct 10 07:59:12 crc kubenswrapper[4822]: I1010 07:59:12.987111 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jj4c9" event={"ID":"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13","Type":"ContainerStarted","Data":"fdbe276602026d393711efeb9311de853a3e1577c54dd7de521d82838cb8240c"} Oct 10 07:59:14 crc kubenswrapper[4822]: I1010 07:59:14.321786 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:14 crc kubenswrapper[4822]: I1010 07:59:14.416693 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5h2m\" (UniqueName: \"kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m\") pod \"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13\" (UID: \"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13\") " Oct 10 07:59:14 crc kubenswrapper[4822]: I1010 07:59:14.426439 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m" (OuterVolumeSpecName: "kube-api-access-m5h2m") pod "d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" (UID: "d83dd9e7-73fd-49a2-b77f-389ffc3b6f13"). InnerVolumeSpecName "kube-api-access-m5h2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:14 crc kubenswrapper[4822]: I1010 07:59:14.518985 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5h2m\" (UniqueName: \"kubernetes.io/projected/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13-kube-api-access-m5h2m\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:15 crc kubenswrapper[4822]: I1010 07:59:15.004913 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jj4c9" event={"ID":"d83dd9e7-73fd-49a2-b77f-389ffc3b6f13","Type":"ContainerDied","Data":"fdbe276602026d393711efeb9311de853a3e1577c54dd7de521d82838cb8240c"} Oct 10 07:59:15 crc kubenswrapper[4822]: I1010 07:59:15.004960 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdbe276602026d393711efeb9311de853a3e1577c54dd7de521d82838cb8240c" Oct 10 07:59:15 crc kubenswrapper[4822]: I1010 07:59:15.004985 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jj4c9" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.080398 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fd16-account-create-dhss6"] Oct 10 07:59:22 crc kubenswrapper[4822]: E1010 07:59:22.083903 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" containerName="mariadb-database-create" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.084023 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" containerName="mariadb-database-create" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.084710 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" containerName="mariadb-database-create" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.094675 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.096395 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fd16-account-create-dhss6"] Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.100776 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.154504 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcrws\" (UniqueName: \"kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws\") pod \"cinder-fd16-account-create-dhss6\" (UID: \"439c4dfe-0c23-4109-8c55-202a6d68fa41\") " pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.256458 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcrws\" (UniqueName: 
\"kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws\") pod \"cinder-fd16-account-create-dhss6\" (UID: \"439c4dfe-0c23-4109-8c55-202a6d68fa41\") " pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.289538 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcrws\" (UniqueName: \"kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws\") pod \"cinder-fd16-account-create-dhss6\" (UID: \"439c4dfe-0c23-4109-8c55-202a6d68fa41\") " pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.424851 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:22 crc kubenswrapper[4822]: I1010 07:59:22.951432 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fd16-account-create-dhss6"] Oct 10 07:59:23 crc kubenswrapper[4822]: I1010 07:59:23.094389 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd16-account-create-dhss6" event={"ID":"439c4dfe-0c23-4109-8c55-202a6d68fa41","Type":"ContainerStarted","Data":"8fcae076bb902e70764a05aed19b1c4311bc579cecde74649798a6e142a9f6ff"} Oct 10 07:59:24 crc kubenswrapper[4822]: I1010 07:59:24.105910 4822 generic.go:334] "Generic (PLEG): container finished" podID="439c4dfe-0c23-4109-8c55-202a6d68fa41" containerID="352be9c9d2e6f65fc4e5a6dc095ad8f858070b8e1964a64d04e36b73a81c8bea" exitCode=0 Oct 10 07:59:24 crc kubenswrapper[4822]: I1010 07:59:24.105977 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd16-account-create-dhss6" event={"ID":"439c4dfe-0c23-4109-8c55-202a6d68fa41","Type":"ContainerDied","Data":"352be9c9d2e6f65fc4e5a6dc095ad8f858070b8e1964a64d04e36b73a81c8bea"} Oct 10 07:59:25 crc kubenswrapper[4822]: I1010 07:59:25.506920 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:25 crc kubenswrapper[4822]: I1010 07:59:25.548469 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcrws\" (UniqueName: \"kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws\") pod \"439c4dfe-0c23-4109-8c55-202a6d68fa41\" (UID: \"439c4dfe-0c23-4109-8c55-202a6d68fa41\") " Oct 10 07:59:25 crc kubenswrapper[4822]: I1010 07:59:25.554384 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws" (OuterVolumeSpecName: "kube-api-access-pcrws") pod "439c4dfe-0c23-4109-8c55-202a6d68fa41" (UID: "439c4dfe-0c23-4109-8c55-202a6d68fa41"). InnerVolumeSpecName "kube-api-access-pcrws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:25 crc kubenswrapper[4822]: I1010 07:59:25.651719 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcrws\" (UniqueName: \"kubernetes.io/projected/439c4dfe-0c23-4109-8c55-202a6d68fa41-kube-api-access-pcrws\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:26 crc kubenswrapper[4822]: I1010 07:59:26.126706 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd16-account-create-dhss6" event={"ID":"439c4dfe-0c23-4109-8c55-202a6d68fa41","Type":"ContainerDied","Data":"8fcae076bb902e70764a05aed19b1c4311bc579cecde74649798a6e142a9f6ff"} Oct 10 07:59:26 crc kubenswrapper[4822]: I1010 07:59:26.126750 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fcae076bb902e70764a05aed19b1c4311bc579cecde74649798a6e142a9f6ff" Oct 10 07:59:26 crc kubenswrapper[4822]: I1010 07:59:26.127022 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fd16-account-create-dhss6" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.293475 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-pjxfs"] Oct 10 07:59:27 crc kubenswrapper[4822]: E1010 07:59:27.294200 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439c4dfe-0c23-4109-8c55-202a6d68fa41" containerName="mariadb-account-create" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.294215 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="439c4dfe-0c23-4109-8c55-202a6d68fa41" containerName="mariadb-account-create" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.294416 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="439c4dfe-0c23-4109-8c55-202a6d68fa41" containerName="mariadb-account-create" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.295199 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.300852 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.300935 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.301007 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s2sdf" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.305922 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pjxfs"] Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388194 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: 
\"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388247 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388273 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388404 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388480 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frp9d\" (UniqueName: \"kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.388578 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " 
pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.490793 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.490926 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.490967 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.490995 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.491046 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.491084 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-frp9d\" (UniqueName: \"kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.491497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.498387 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.499256 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.501359 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.508167 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle\") pod 
\"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.511095 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frp9d\" (UniqueName: \"kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d\") pod \"cinder-db-sync-pjxfs\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:27 crc kubenswrapper[4822]: I1010 07:59:27.616761 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:28 crc kubenswrapper[4822]: I1010 07:59:28.154361 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pjxfs"] Oct 10 07:59:28 crc kubenswrapper[4822]: W1010 07:59:28.159281 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1389c824_6a9a_4194_a74a_6b85d381a3df.slice/crio-86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714 WatchSource:0}: Error finding container 86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714: Status 404 returned error can't find the container with id 86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714 Oct 10 07:59:29 crc kubenswrapper[4822]: I1010 07:59:29.161236 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pjxfs" event={"ID":"1389c824-6a9a-4194-a74a-6b85d381a3df","Type":"ContainerStarted","Data":"14a75ec7486424f6d351c35963558619cc274df17be776d2456eddb0c40729f5"} Oct 10 07:59:29 crc kubenswrapper[4822]: I1010 07:59:29.161664 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pjxfs" event={"ID":"1389c824-6a9a-4194-a74a-6b85d381a3df","Type":"ContainerStarted","Data":"86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714"} Oct 10 07:59:31 crc 
kubenswrapper[4822]: I1010 07:59:31.336648 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:59:31 crc kubenswrapper[4822]: I1010 07:59:31.337464 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:59:31 crc kubenswrapper[4822]: I1010 07:59:31.337543 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 07:59:31 crc kubenswrapper[4822]: I1010 07:59:31.338733 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:59:31 crc kubenswrapper[4822]: I1010 07:59:31.338896 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" gracePeriod=600 Oct 10 07:59:31 crc kubenswrapper[4822]: E1010 07:59:31.471171 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.196587 4822 generic.go:334] "Generic (PLEG): container finished" podID="1389c824-6a9a-4194-a74a-6b85d381a3df" containerID="14a75ec7486424f6d351c35963558619cc274df17be776d2456eddb0c40729f5" exitCode=0 Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.196625 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pjxfs" event={"ID":"1389c824-6a9a-4194-a74a-6b85d381a3df","Type":"ContainerDied","Data":"14a75ec7486424f6d351c35963558619cc274df17be776d2456eddb0c40729f5"} Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.202683 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" exitCode=0 Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.202739 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"} Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.202816 4822 scope.go:117] "RemoveContainer" containerID="6be1afa435827691537f13fe12c13c00ff6d1428cd681ee10e85bde839016aa2" Oct 10 07:59:32 crc kubenswrapper[4822]: I1010 07:59:32.203554 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 07:59:32 crc kubenswrapper[4822]: E1010 07:59:32.203901 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.523165 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624109 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624172 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624212 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624326 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624363 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624408 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frp9d\" (UniqueName: \"kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d\") pod \"1389c824-6a9a-4194-a74a-6b85d381a3df\" (UID: \"1389c824-6a9a-4194-a74a-6b85d381a3df\") " Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.624569 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.625833 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1389c824-6a9a-4194-a74a-6b85d381a3df-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.630183 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts" (OuterVolumeSpecName: "scripts") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.631797 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d" (OuterVolumeSpecName: "kube-api-access-frp9d") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "kube-api-access-frp9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.640081 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.668069 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.681570 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data" (OuterVolumeSpecName: "config-data") pod "1389c824-6a9a-4194-a74a-6b85d381a3df" (UID: "1389c824-6a9a-4194-a74a-6b85d381a3df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.728238 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.728288 4822 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.728304 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frp9d\" (UniqueName: \"kubernetes.io/projected/1389c824-6a9a-4194-a74a-6b85d381a3df-kube-api-access-frp9d\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.728318 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:33 crc kubenswrapper[4822]: I1010 07:59:33.728336 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1389c824-6a9a-4194-a74a-6b85d381a3df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.235853 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pjxfs" event={"ID":"1389c824-6a9a-4194-a74a-6b85d381a3df","Type":"ContainerDied","Data":"86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714"} Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.235899 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fb321d31e0197f0dbf8446465a69c7ad633a6ddeddd0db37b0531efbf95714" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.236008 4822 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pjxfs" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.588107 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 07:59:34 crc kubenswrapper[4822]: E1010 07:59:34.603658 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389c824-6a9a-4194-a74a-6b85d381a3df" containerName="cinder-db-sync" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.603965 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389c824-6a9a-4194-a74a-6b85d381a3df" containerName="cinder-db-sync" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.610430 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1389c824-6a9a-4194-a74a-6b85d381a3df" containerName="cinder-db-sync" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.613462 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.694018 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.717358 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.719003 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.724631 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s2sdf" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.725172 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.725222 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.725273 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.727459 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.753564 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.754211 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.754425 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2g44\" (UniqueName: \"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: 
\"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.754527 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.754644 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.856656 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.856941 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857072 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " 
pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857193 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857301 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857494 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857602 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.857712 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2g44\" (UniqueName: \"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858081 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858103 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858243 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858304 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858312 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858418 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.858864 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.859192 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.881178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2g44\" (UniqueName: \"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44\") pod \"dnsmasq-dns-5c4f97f5f5-tkv86\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.949707 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.959864 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960267 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960331 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960470 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960516 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960583 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.960859 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.961430 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.963502 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.964865 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.964930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.966195 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.966349 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:34 crc kubenswrapper[4822]: I1010 07:59:34.985243 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km\") pod \"cinder-api-0\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " pod="openstack/cinder-api-0" Oct 10 07:59:35 crc kubenswrapper[4822]: I1010 07:59:35.036362 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:59:35 crc kubenswrapper[4822]: I1010 07:59:35.625398 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:59:35 crc kubenswrapper[4822]: W1010 07:59:35.628560 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10da547f_365f_4f9b_933b_8951adfd60b3.slice/crio-970a8e4b128fc4ca1a529413eeba4efab3a88643e390967334fa055d26639ff7 WatchSource:0}: Error finding container 970a8e4b128fc4ca1a529413eeba4efab3a88643e390967334fa055d26639ff7: Status 404 returned error can't find the container with id 970a8e4b128fc4ca1a529413eeba4efab3a88643e390967334fa055d26639ff7 Oct 10 07:59:35 crc kubenswrapper[4822]: I1010 07:59:35.739926 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 07:59:35 crc kubenswrapper[4822]: W1010 07:59:35.751115 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd06439_9c72_4f40_b9b5_326e11904d84.slice/crio-e58a5c948a868ec63a6a8cce711e876d6a2889f3b4aaacf6b0831b658ae27e0e WatchSource:0}: Error finding container e58a5c948a868ec63a6a8cce711e876d6a2889f3b4aaacf6b0831b658ae27e0e: Status 404 returned error can't find the container with id e58a5c948a868ec63a6a8cce711e876d6a2889f3b4aaacf6b0831b658ae27e0e Oct 10 07:59:36 crc kubenswrapper[4822]: I1010 07:59:36.295238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerStarted","Data":"7938c47c7fac6492f36e4367f662d924b3a32cba7c26a0ce7ee3cd4018a472d0"} Oct 10 07:59:36 crc kubenswrapper[4822]: I1010 07:59:36.296205 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerStarted","Data":"970a8e4b128fc4ca1a529413eeba4efab3a88643e390967334fa055d26639ff7"} Oct 10 07:59:36 crc kubenswrapper[4822]: I1010 07:59:36.298088 4822 generic.go:334] "Generic (PLEG): container finished" podID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerID="660ef52da5c705a0adeee9720ec10ce6491567a54816055e67188d0cf8d57cb0" exitCode=0 Oct 10 07:59:36 crc kubenswrapper[4822]: I1010 07:59:36.298128 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" event={"ID":"7cd06439-9c72-4f40-b9b5-326e11904d84","Type":"ContainerDied","Data":"660ef52da5c705a0adeee9720ec10ce6491567a54816055e67188d0cf8d57cb0"} Oct 10 07:59:36 crc kubenswrapper[4822]: I1010 07:59:36.298183 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" event={"ID":"7cd06439-9c72-4f40-b9b5-326e11904d84","Type":"ContainerStarted","Data":"e58a5c948a868ec63a6a8cce711e876d6a2889f3b4aaacf6b0831b658ae27e0e"} Oct 10 07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.308349 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" event={"ID":"7cd06439-9c72-4f40-b9b5-326e11904d84","Type":"ContainerStarted","Data":"63783dd67353e256011aa2a561c080cd64dc8b4910149f251c16ef327e54280a"} Oct 10 07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.308878 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.310316 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerStarted","Data":"d058dc521bd18cd761eaa9bed847e3e0939fd9bf6ba89fb841a94bd4c73d2223"} Oct 10 07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.310735 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 
07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.328839 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" podStartSLOduration=3.328773854 podStartE2EDuration="3.328773854s" podCreationTimestamp="2025-10-10 07:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:37.324770458 +0000 UTC m=+5724.419928654" watchObservedRunningTime="2025-10-10 07:59:37.328773854 +0000 UTC m=+5724.423932090" Oct 10 07:59:37 crc kubenswrapper[4822]: I1010 07:59:37.348189 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.348165613 podStartE2EDuration="3.348165613s" podCreationTimestamp="2025-10-10 07:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:37.342927432 +0000 UTC m=+5724.438085638" watchObservedRunningTime="2025-10-10 07:59:37.348165613 +0000 UTC m=+5724.443323829" Oct 10 07:59:43 crc kubenswrapper[4822]: I1010 07:59:43.660130 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 07:59:43 crc kubenswrapper[4822]: E1010 07:59:43.661122 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:59:44 crc kubenswrapper[4822]: I1010 07:59:44.951122 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 07:59:45 crc 
kubenswrapper[4822]: I1010 07:59:45.036796 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.037167 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="dnsmasq-dns" containerID="cri-o://90fc98cecdbfb86d797ef58147f771673badb718abd68c71aac864d0390b6164" gracePeriod=10 Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.418088 4822 generic.go:334] "Generic (PLEG): container finished" podID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerID="90fc98cecdbfb86d797ef58147f771673badb718abd68c71aac864d0390b6164" exitCode=0 Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.418337 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" event={"ID":"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3","Type":"ContainerDied","Data":"90fc98cecdbfb86d797ef58147f771673badb718abd68c71aac864d0390b6164"} Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.547922 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.670352 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fgxg\" (UniqueName: \"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg\") pod \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.670394 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb\") pod \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.670433 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config\") pod \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.670564 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc\") pod \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.670599 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb\") pod \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\" (UID: \"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3\") " Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.677048 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg" (OuterVolumeSpecName: "kube-api-access-8fgxg") pod "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" (UID: "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3"). InnerVolumeSpecName "kube-api-access-8fgxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.724928 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" (UID: "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.729066 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" (UID: "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.739711 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" (UID: "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.745499 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config" (OuterVolumeSpecName: "config") pod "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" (UID: "ed464cd5-17cb-42e1-811f-ffa7ab6b33f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.773227 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.773266 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.773279 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fgxg\" (UniqueName: \"kubernetes.io/projected/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-kube-api-access-8fgxg\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.773288 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:45 crc kubenswrapper[4822]: I1010 07:59:45.773297 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.428509 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" event={"ID":"ed464cd5-17cb-42e1-811f-ffa7ab6b33f3","Type":"ContainerDied","Data":"222cfb0fdaea8dad275c2a98b5d854775b2209f291739121fc6edc0f65c05334"} Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.430013 4822 scope.go:117] "RemoveContainer" containerID="90fc98cecdbfb86d797ef58147f771673badb718abd68c71aac864d0390b6164" Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.428557 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb" Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.467229 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.471672 4822 scope.go:117] "RemoveContainer" containerID="3d45c80b386d7e2dfccf3e253852025b59c7c8c27e29ccdac0ad814a553aae13" Oct 10 07:59:46 crc kubenswrapper[4822]: I1010 07:59:46.477184 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cf8bfcd7c-kc8cb"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.098952 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.099299 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" containerID="cri-o://11f62a5cb5fcd6484bb29ef93a60cc272a3e044520425a81becb5bfff795404d" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.099894 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" containerID="cri-o://7b797a260d5e5d69497019659f7dc12bf7a7bfbd5eabd8ee7463750fbac87bd0" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.111695 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.112128 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-log" containerID="cri-o://78c86f0f15a74965de4c06ebe75c6115730467e5b9b70b39f0a30d9944285bbf" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.112374 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-api" containerID="cri-o://4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.118836 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.119153 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerName="nova-scheduler-scheduler" containerID="cri-o://4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.131882 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.132158 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f619e646-b85d-40a4-bb47-67db89884281" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.141150 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.141588 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8b82e92e-a46c-4015-9455-ee5319632827" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41" gracePeriod=30 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.148615 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cinder-api-0" Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.438733 4822 generic.go:334] "Generic (PLEG): container finished" podID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerID="78c86f0f15a74965de4c06ebe75c6115730467e5b9b70b39f0a30d9944285bbf" exitCode=143 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.438793 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerDied","Data":"78c86f0f15a74965de4c06ebe75c6115730467e5b9b70b39f0a30d9944285bbf"} Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.446148 4822 generic.go:334] "Generic (PLEG): container finished" podID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerID="11f62a5cb5fcd6484bb29ef93a60cc272a3e044520425a81becb5bfff795404d" exitCode=143 Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.446207 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerDied","Data":"11f62a5cb5fcd6484bb29ef93a60cc272a3e044520425a81becb5bfff795404d"} Oct 10 07:59:47 crc kubenswrapper[4822]: I1010 07:59:47.661614 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" path="/var/lib/kubelet/pods/ed464cd5-17cb-42e1-811f-ffa7ab6b33f3/volumes" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.043986 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.085232 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.093544 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.095267 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.095325 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerName="nova-scheduler-scheduler" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.150199 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data\") pod \"8b82e92e-a46c-4015-9455-ee5319632827\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " Oct 10 07:59:48 crc 
kubenswrapper[4822]: I1010 07:59:48.150309 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle\") pod \"8b82e92e-a46c-4015-9455-ee5319632827\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.150385 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lj7\" (UniqueName: \"kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7\") pod \"8b82e92e-a46c-4015-9455-ee5319632827\" (UID: \"8b82e92e-a46c-4015-9455-ee5319632827\") " Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.163987 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7" (OuterVolumeSpecName: "kube-api-access-b9lj7") pod "8b82e92e-a46c-4015-9455-ee5319632827" (UID: "8b82e92e-a46c-4015-9455-ee5319632827"). InnerVolumeSpecName "kube-api-access-b9lj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.179054 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data" (OuterVolumeSpecName: "config-data") pod "8b82e92e-a46c-4015-9455-ee5319632827" (UID: "8b82e92e-a46c-4015-9455-ee5319632827"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.179519 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b82e92e-a46c-4015-9455-ee5319632827" (UID: "8b82e92e-a46c-4015-9455-ee5319632827"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.254342 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.254394 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9lj7\" (UniqueName: \"kubernetes.io/projected/8b82e92e-a46c-4015-9455-ee5319632827-kube-api-access-b9lj7\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.254414 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b82e92e-a46c-4015-9455-ee5319632827-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.338363 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.340086 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.341350 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.341460 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="f619e646-b85d-40a4-bb47-67db89884281" containerName="nova-cell0-conductor-conductor" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.462057 4822 generic.go:334] "Generic (PLEG): container finished" podID="8b82e92e-a46c-4015-9455-ee5319632827" containerID="c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41" exitCode=0 Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.462106 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b82e92e-a46c-4015-9455-ee5319632827","Type":"ContainerDied","Data":"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41"} Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.462137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b82e92e-a46c-4015-9455-ee5319632827","Type":"ContainerDied","Data":"1db359ffeead417531fa60628c2e4f7a2d39dc010b3f83830cd163ee6fe101b3"} Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.462156 4822 scope.go:117] "RemoveContainer" containerID="c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.462153 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.506415 4822 scope.go:117] "RemoveContainer" containerID="c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41" Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.507014 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41\": container with ID starting with c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41 not found: ID does not exist" containerID="c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.507068 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41"} err="failed to get container status \"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41\": rpc error: code = NotFound desc = could not find container \"c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41\": container with ID starting with c912b779f4d80c788951146379fc5a0a5d9dc49826456d60d54a534b6ad83d41 not found: ID does not exist" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.516882 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.529912 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.541113 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.541741 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="dnsmasq-dns" Oct 10 07:59:48 crc 
kubenswrapper[4822]: I1010 07:59:48.541766 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="dnsmasq-dns" Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.541793 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="init" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.541869 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="init" Oct 10 07:59:48 crc kubenswrapper[4822]: E1010 07:59:48.541890 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b82e92e-a46c-4015-9455-ee5319632827" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.541899 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b82e92e-a46c-4015-9455-ee5319632827" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.542085 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed464cd5-17cb-42e1-811f-ffa7ab6b33f3" containerName="dnsmasq-dns" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.542118 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b82e92e-a46c-4015-9455-ee5319632827" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.543122 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.569079 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.570311 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.570500 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.570687 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk24s\" (UniqueName: \"kubernetes.io/projected/7e555555-d99c-4a6f-af86-74539d7163e4-kube-api-access-qk24s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.571944 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.672987 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 
07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.673389 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk24s\" (UniqueName: \"kubernetes.io/projected/7e555555-d99c-4a6f-af86-74539d7163e4-kube-api-access-qk24s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.673526 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.687465 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.690065 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk24s\" (UniqueName: \"kubernetes.io/projected/7e555555-d99c-4a6f-af86-74539d7163e4-kube-api-access-qk24s\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.691251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e555555-d99c-4a6f-af86-74539d7163e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e555555-d99c-4a6f-af86-74539d7163e4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:48 crc kubenswrapper[4822]: I1010 07:59:48.875121 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:49 crc kubenswrapper[4822]: I1010 07:59:49.366461 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:59:49 crc kubenswrapper[4822]: W1010 07:59:49.371357 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e555555_d99c_4a6f_af86_74539d7163e4.slice/crio-cd139889d4639232c7c9cd952b61a0d0d473bf98d2c8c11b3308a766b777a51d WatchSource:0}: Error finding container cd139889d4639232c7c9cd952b61a0d0d473bf98d2c8c11b3308a766b777a51d: Status 404 returned error can't find the container with id cd139889d4639232c7c9cd952b61a0d0d473bf98d2c8c11b3308a766b777a51d Oct 10 07:59:49 crc kubenswrapper[4822]: I1010 07:59:49.473465 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e555555-d99c-4a6f-af86-74539d7163e4","Type":"ContainerStarted","Data":"cd139889d4639232c7c9cd952b61a0d0d473bf98d2c8c11b3308a766b777a51d"} Oct 10 07:59:49 crc kubenswrapper[4822]: I1010 07:59:49.699139 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b82e92e-a46c-4015-9455-ee5319632827" path="/var/lib/kubelet/pods/8b82e92e-a46c-4015-9455-ee5319632827/volumes" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.251704 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": read tcp 10.217.0.2:43214->10.217.1.75:8775: read: connection reset by peer" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.251746 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": 
read tcp 10.217.0.2:43206->10.217.1.75:8775: read: connection reset by peer" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.332753 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.332985 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd" gracePeriod=30 Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.514003 4822 generic.go:334] "Generic (PLEG): container finished" podID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerID="7b797a260d5e5d69497019659f7dc12bf7a7bfbd5eabd8ee7463750fbac87bd0" exitCode=0 Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.514123 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerDied","Data":"7b797a260d5e5d69497019659f7dc12bf7a7bfbd5eabd8ee7463750fbac87bd0"} Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.517142 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e555555-d99c-4a6f-af86-74539d7163e4","Type":"ContainerStarted","Data":"3a1d7c00d2dbb9da4d73b9898438047182ce42c8b42f08e3b25a56b9bd362ec3"} Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.520959 4822 generic.go:334] "Generic (PLEG): container finished" podID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerID="4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44" exitCode=0 Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.520995 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerDied","Data":"4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44"} Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.539586 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.539567023 podStartE2EDuration="2.539567023s" podCreationTimestamp="2025-10-10 07:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:50.535412483 +0000 UTC m=+5737.630570679" watchObservedRunningTime="2025-10-10 07:59:50.539567023 +0000 UTC m=+5737.634725219" Oct 10 07:59:50 crc kubenswrapper[4822]: E1010 07:59:50.570338 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc003fc_2bcf_4485_8024_62783b2a6e83.slice/crio-4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc003fc_2bcf_4485_8024_62783b2a6e83.slice/crio-conmon-4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.860072 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.865427 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919228 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89xrs\" (UniqueName: \"kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs\") pod \"7dc003fc-2bcf-4485-8024-62783b2a6e83\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919483 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle\") pod \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919558 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs\") pod \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919621 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle\") pod \"7dc003fc-2bcf-4485-8024-62783b2a6e83\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919703 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data\") pod \"7dc003fc-2bcf-4485-8024-62783b2a6e83\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919766 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs\") pod \"7dc003fc-2bcf-4485-8024-62783b2a6e83\" (UID: \"7dc003fc-2bcf-4485-8024-62783b2a6e83\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919872 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96m65\" (UniqueName: \"kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65\") pod \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.919961 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data\") pod \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\" (UID: \"ef4f8712-93f7-4440-a8f2-6407a49ea48d\") " Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.921587 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs" (OuterVolumeSpecName: "logs") pod "7dc003fc-2bcf-4485-8024-62783b2a6e83" (UID: "7dc003fc-2bcf-4485-8024-62783b2a6e83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.922116 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs" (OuterVolumeSpecName: "logs") pod "ef4f8712-93f7-4440-a8f2-6407a49ea48d" (UID: "ef4f8712-93f7-4440-a8f2-6407a49ea48d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.965865 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs" (OuterVolumeSpecName: "kube-api-access-89xrs") pod "7dc003fc-2bcf-4485-8024-62783b2a6e83" (UID: "7dc003fc-2bcf-4485-8024-62783b2a6e83"). InnerVolumeSpecName "kube-api-access-89xrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.967968 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65" (OuterVolumeSpecName: "kube-api-access-96m65") pod "ef4f8712-93f7-4440-a8f2-6407a49ea48d" (UID: "ef4f8712-93f7-4440-a8f2-6407a49ea48d"). InnerVolumeSpecName "kube-api-access-96m65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.972199 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data" (OuterVolumeSpecName: "config-data") pod "7dc003fc-2bcf-4485-8024-62783b2a6e83" (UID: "7dc003fc-2bcf-4485-8024-62783b2a6e83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.977446 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data" (OuterVolumeSpecName: "config-data") pod "ef4f8712-93f7-4440-a8f2-6407a49ea48d" (UID: "ef4f8712-93f7-4440-a8f2-6407a49ea48d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.987066 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef4f8712-93f7-4440-a8f2-6407a49ea48d" (UID: "ef4f8712-93f7-4440-a8f2-6407a49ea48d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:50 crc kubenswrapper[4822]: I1010 07:59:50.998997 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dc003fc-2bcf-4485-8024-62783b2a6e83" (UID: "7dc003fc-2bcf-4485-8024-62783b2a6e83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022507 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89xrs\" (UniqueName: \"kubernetes.io/projected/7dc003fc-2bcf-4485-8024-62783b2a6e83-kube-api-access-89xrs\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022547 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022557 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4f8712-93f7-4440-a8f2-6407a49ea48d-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022565 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022575 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc003fc-2bcf-4485-8024-62783b2a6e83-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022585 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dc003fc-2bcf-4485-8024-62783b2a6e83-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022593 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96m65\" (UniqueName: \"kubernetes.io/projected/ef4f8712-93f7-4440-a8f2-6407a49ea48d-kube-api-access-96m65\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.022602 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4f8712-93f7-4440-a8f2-6407a49ea48d-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.535272 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7dc003fc-2bcf-4485-8024-62783b2a6e83","Type":"ContainerDied","Data":"1ab5c3409a19bde3714b866f98a3572245fcc14bc8e937f4b1fbf641fc2873bc"} Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.535330 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.535339 4822 scope.go:117] "RemoveContainer" containerID="4889eb988241586dce69acb9f9c26fd1defd86117cd2b71657284c85a9f63c44" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.542728 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef4f8712-93f7-4440-a8f2-6407a49ea48d","Type":"ContainerDied","Data":"cfa5bf038fb3d2df92cd63ba013dad65178837fa031b0b9dc1b2f5c4a2a0d595"} Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.542758 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.633667 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.635024 4822 scope.go:117] "RemoveContainer" containerID="78c86f0f15a74965de4c06ebe75c6115730467e5b9b70b39f0a30d9944285bbf" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.643836 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.671097 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" path="/var/lib/kubelet/pods/ef4f8712-93f7-4440-a8f2-6407a49ea48d/volumes" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.671660 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.689932 4822 scope.go:117] "RemoveContainer" containerID="7b797a260d5e5d69497019659f7dc12bf7a7bfbd5eabd8ee7463750fbac87bd0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.702224 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: E1010 07:59:51.702775 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-api" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.702795 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-api" Oct 10 07:59:51 crc kubenswrapper[4822]: E1010 07:59:51.702862 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-log" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.702871 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-log" Oct 10 07:59:51 crc kubenswrapper[4822]: E1010 07:59:51.702888 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.702896 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" Oct 10 07:59:51 crc kubenswrapper[4822]: E1010 07:59:51.702927 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.702935 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.703150 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-log" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.703174 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-log" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.703184 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" containerName="nova-api-api" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.703208 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4f8712-93f7-4440-a8f2-6407a49ea48d" containerName="nova-metadata-metadata" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.704473 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.706922 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.730084 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.736378 4822 scope.go:117] "RemoveContainer" containerID="11f62a5cb5fcd6484bb29ef93a60cc272a3e044520425a81becb5bfff795404d" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.736787 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.740045 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.740172 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z9ck\" (UniqueName: 
\"kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.740223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.774630 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.811570 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.813536 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.822632 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.824317 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842093 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842379 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z9ck\" (UniqueName: \"kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck\") pod \"nova-metadata-0\" (UID: 
\"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842608 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842697 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpsgj\" (UniqueName: \"kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842786 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.842888 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.843000 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.843100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.849083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.868471 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.869494 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z9ck\" (UniqueName: \"kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck\") pod \"nova-metadata-0\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " pod="openstack/nova-metadata-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.937006 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.947778 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.948447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpsgj\" (UniqueName: \"kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.950874 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.951918 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.952792 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.966271 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.967331 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:51 crc kubenswrapper[4822]: I1010 07:59:51.970755 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpsgj\" (UniqueName: \"kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj\") pod \"nova-api-0\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " pod="openstack/nova-api-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.053514 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle\") pod \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.057025 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfcwf\" (UniqueName: \"kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf\") pod \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.057257 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data\") pod \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\" (UID: \"53e314d1-6335-4c39-a8a2-d164cdd11d9a\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.063119 
4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf" (OuterVolumeSpecName: "kube-api-access-pfcwf") pod "53e314d1-6335-4c39-a8a2-d164cdd11d9a" (UID: "53e314d1-6335-4c39-a8a2-d164cdd11d9a"). InnerVolumeSpecName "kube-api-access-pfcwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.087974 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53e314d1-6335-4c39-a8a2-d164cdd11d9a" (UID: "53e314d1-6335-4c39-a8a2-d164cdd11d9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.092909 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data" (OuterVolumeSpecName: "config-data") pod "53e314d1-6335-4c39-a8a2-d164cdd11d9a" (UID: "53e314d1-6335-4c39-a8a2-d164cdd11d9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.113596 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.115473 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.136895 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.162650 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnn9k\" (UniqueName: \"kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k\") pod \"84f37f7a-3fb2-4fca-b877-57c6038e176b\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.162857 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data\") pod \"84f37f7a-3fb2-4fca-b877-57c6038e176b\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.162910 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle\") pod \"84f37f7a-3fb2-4fca-b877-57c6038e176b\" (UID: \"84f37f7a-3fb2-4fca-b877-57c6038e176b\") " Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.163543 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.163565 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e314d1-6335-4c39-a8a2-d164cdd11d9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.163578 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfcwf\" (UniqueName: \"kubernetes.io/projected/53e314d1-6335-4c39-a8a2-d164cdd11d9a-kube-api-access-pfcwf\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 
07:59:52.181683 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k" (OuterVolumeSpecName: "kube-api-access-hnn9k") pod "84f37f7a-3fb2-4fca-b877-57c6038e176b" (UID: "84f37f7a-3fb2-4fca-b877-57c6038e176b"). InnerVolumeSpecName "kube-api-access-hnn9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.192669 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84f37f7a-3fb2-4fca-b877-57c6038e176b" (UID: "84f37f7a-3fb2-4fca-b877-57c6038e176b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.195629 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data" (OuterVolumeSpecName: "config-data") pod "84f37f7a-3fb2-4fca-b877-57c6038e176b" (UID: "84f37f7a-3fb2-4fca-b877-57c6038e176b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.265302 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnn9k\" (UniqueName: \"kubernetes.io/projected/84f37f7a-3fb2-4fca-b877-57c6038e176b-kube-api-access-hnn9k\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.265350 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.265365 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84f37f7a-3fb2-4fca-b877-57c6038e176b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.565232 4822 generic.go:334] "Generic (PLEG): container finished" podID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" containerID="f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd" exitCode=0 Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.565295 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.565304 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"53e314d1-6335-4c39-a8a2-d164cdd11d9a","Type":"ContainerDied","Data":"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd"} Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.565831 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"53e314d1-6335-4c39-a8a2-d164cdd11d9a","Type":"ContainerDied","Data":"89253db5c204baf9502c2e71b827139bc2668d5be7e9955cc9208ab05363290a"} Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.565861 4822 scope.go:117] "RemoveContainer" containerID="f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.570793 4822 generic.go:334] "Generic (PLEG): container finished" podID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" exitCode=0 Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.570851 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84f37f7a-3fb2-4fca-b877-57c6038e176b","Type":"ContainerDied","Data":"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c"} Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.570876 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84f37f7a-3fb2-4fca-b877-57c6038e176b","Type":"ContainerDied","Data":"57e8c36a12d1dae5805f3201a8a662d04935f176c3847a071af2b4c410d8b0ef"} Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.570921 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.603023 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.603971 4822 scope.go:117] "RemoveContainer" containerID="f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd" Oct 10 07:59:52 crc kubenswrapper[4822]: E1010 07:59:52.606938 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd\": container with ID starting with f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd not found: ID does not exist" containerID="f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.606997 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd"} err="failed to get container status \"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd\": rpc error: code = NotFound desc = could not find container \"f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd\": container with ID starting with f44452e638b6dfcc1ca8b49647052c9694c19595daf3ae03d701be47b3b9cebd not found: ID does not exist" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.607052 4822 scope.go:117] "RemoveContainer" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.620847 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.638639 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 
07:59:52.642107 4822 scope.go:117] "RemoveContainer" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" Oct 10 07:59:52 crc kubenswrapper[4822]: E1010 07:59:52.642582 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c\": container with ID starting with 4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c not found: ID does not exist" containerID="4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.642617 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c"} err="failed to get container status \"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c\": rpc error: code = NotFound desc = could not find container \"4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c\": container with ID starting with 4ec4fd7b766ae5abdf44971a3dca84cfc6b35912483cfd59394255fe5a2b910c not found: ID does not exist" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.653942 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.674866 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: E1010 07:59:52.675409 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerName="nova-scheduler-scheduler" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.675430 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerName="nova-scheduler-scheduler" Oct 10 07:59:52 crc kubenswrapper[4822]: E1010 07:59:52.675466 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" containerName="nova-cell1-conductor-conductor" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.675475 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" containerName="nova-cell1-conductor-conductor" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.675714 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" containerName="nova-cell1-conductor-conductor" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.675733 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" containerName="nova-scheduler-scheduler" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.676631 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.681015 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.689504 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.701336 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.703100 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.707322 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:59:52 crc kubenswrapper[4822]: W1010 07:59:52.724655 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode146fc70_ffc2_40af_b67c_f636fa7019b6.slice/crio-abf571fff4416e28175de14e4a4d0a92ad430d8b7dbd4dce0ac91197a4890528 WatchSource:0}: Error finding container abf571fff4416e28175de14e4a4d0a92ad430d8b7dbd4dce0ac91197a4890528: Status 404 returned error can't find the container with id abf571fff4416e28175de14e4a4d0a92ad430d8b7dbd4dce0ac91197a4890528 Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.726145 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.747316 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.767797 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.777944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.778027 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " 
pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.778075 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.778195 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.778335 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvkw\" (UniqueName: \"kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.778370 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpm7g\" (UniqueName: \"kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.880936 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc 
kubenswrapper[4822]: I1010 07:59:52.881406 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.881441 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.881539 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.881780 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvkw\" (UniqueName: \"kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.882013 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpm7g\" (UniqueName: \"kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.888843 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.889104 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.889397 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.889640 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.902410 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpm7g\" (UniqueName: \"kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g\") pod \"nova-scheduler-0\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " pod="openstack/nova-scheduler-0" Oct 10 07:59:52 crc kubenswrapper[4822]: I1010 07:59:52.902984 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvkw\" (UniqueName: \"kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw\") pod \"nova-cell1-conductor-0\" (UID: 
\"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.005334 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.060209 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.084301 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle\") pod \"f619e646-b85d-40a4-bb47-67db89884281\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.084401 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfrq\" (UniqueName: \"kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq\") pod \"f619e646-b85d-40a4-bb47-67db89884281\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.084498 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data\") pod \"f619e646-b85d-40a4-bb47-67db89884281\" (UID: \"f619e646-b85d-40a4-bb47-67db89884281\") " Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.091251 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq" (OuterVolumeSpecName: "kube-api-access-gjfrq") pod "f619e646-b85d-40a4-bb47-67db89884281" (UID: "f619e646-b85d-40a4-bb47-67db89884281"). InnerVolumeSpecName "kube-api-access-gjfrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.157675 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f619e646-b85d-40a4-bb47-67db89884281" (UID: "f619e646-b85d-40a4-bb47-67db89884281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.169126 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data" (OuterVolumeSpecName: "config-data") pod "f619e646-b85d-40a4-bb47-67db89884281" (UID: "f619e646-b85d-40a4-bb47-67db89884281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.186593 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.186669 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjfrq\" (UniqueName: \"kubernetes.io/projected/f619e646-b85d-40a4-bb47-67db89884281-kube-api-access-gjfrq\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.186688 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f619e646-b85d-40a4-bb47-67db89884281-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.200745 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.485933 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: W1010 07:59:53.499749 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod006a06c2_ba4e_4aea_a817_73e66bd4720a.slice/crio-168f62632dded33f662023c2b190a077037876ed9f509e3ddfd12865a2892974 WatchSource:0}: Error finding container 168f62632dded33f662023c2b190a077037876ed9f509e3ddfd12865a2892974: Status 404 returned error can't find the container with id 168f62632dded33f662023c2b190a077037876ed9f509e3ddfd12865a2892974 Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.603386 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerStarted","Data":"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.603460 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerStarted","Data":"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.603473 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerStarted","Data":"e0c4a2ed9388cce7e362b9275d5828d5c429e588d387eaa85789a860f5d48f9c"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.610854 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerStarted","Data":"11415ca17046e0e8a0dab09d26116dce4c62acf438787669e4a12a2fc0bb829c"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 
07:59:53.610901 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerStarted","Data":"a5cf0feb59f3bc799a7557ae0334acbfbe76ba7aff7c2d24c3a1350b2948dec9"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.610912 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerStarted","Data":"abf571fff4416e28175de14e4a4d0a92ad430d8b7dbd4dce0ac91197a4890528"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.612946 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006a06c2-ba4e-4aea-a817-73e66bd4720a","Type":"ContainerStarted","Data":"168f62632dded33f662023c2b190a077037876ed9f509e3ddfd12865a2892974"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.616781 4822 generic.go:334] "Generic (PLEG): container finished" podID="f619e646-b85d-40a4-bb47-67db89884281" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" exitCode=0 Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.616836 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f619e646-b85d-40a4-bb47-67db89884281","Type":"ContainerDied","Data":"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.616857 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f619e646-b85d-40a4-bb47-67db89884281","Type":"ContainerDied","Data":"470dcd15cccb11fdd8869fb742ba49a7fb2e9e1b2dea130903c7381ecf705482"} Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.616872 4822 scope.go:117] "RemoveContainer" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.616958 4822 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.628472 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.628458122 podStartE2EDuration="2.628458122s" podCreationTimestamp="2025-10-10 07:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:53.626829455 +0000 UTC m=+5740.721987661" watchObservedRunningTime="2025-10-10 07:59:53.628458122 +0000 UTC m=+5740.723616318" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.652154 4822 scope.go:117] "RemoveContainer" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" Oct 10 07:59:53 crc kubenswrapper[4822]: E1010 07:59:53.652737 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24\": container with ID starting with 5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24 not found: ID does not exist" containerID="5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.652775 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24"} err="failed to get container status \"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24\": rpc error: code = NotFound desc = could not find container \"5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24\": container with ID starting with 5767740d90618c7f2b05b2a539ea65fd6d7f3a35e76149ca13268aeb351d3f24 not found: ID does not exist" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.659964 4822 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.659945269 podStartE2EDuration="2.659945269s" podCreationTimestamp="2025-10-10 07:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:53.656052817 +0000 UTC m=+5740.751211023" watchObservedRunningTime="2025-10-10 07:59:53.659945269 +0000 UTC m=+5740.755103465" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.670875 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e314d1-6335-4c39-a8a2-d164cdd11d9a" path="/var/lib/kubelet/pods/53e314d1-6335-4c39-a8a2-d164cdd11d9a/volumes" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.671585 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc003fc-2bcf-4485-8024-62783b2a6e83" path="/var/lib/kubelet/pods/7dc003fc-2bcf-4485-8024-62783b2a6e83/volumes" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.672197 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f37f7a-3fb2-4fca-b877-57c6038e176b" path="/var/lib/kubelet/pods/84f37f7a-3fb2-4fca-b877-57c6038e176b/volumes" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.691793 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.714839 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.797307 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: E1010 07:59:53.798158 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f619e646-b85d-40a4-bb47-67db89884281" containerName="nova-cell0-conductor-conductor" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.798220 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f619e646-b85d-40a4-bb47-67db89884281" containerName="nova-cell0-conductor-conductor" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.799191 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f619e646-b85d-40a4-bb47-67db89884281" containerName="nova-cell0-conductor-conductor" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.800837 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.803939 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.808694 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.818435 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.876078 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.908055 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 07:59:53.908195 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:53 crc kubenswrapper[4822]: I1010 
07:59:53.908369 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x566\" (UniqueName: \"kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.010062 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.010192 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x566\" (UniqueName: \"kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.010265 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.014785 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.015337 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.025904 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x566\" (UniqueName: \"kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566\") pod \"nova-cell0-conductor-0\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.197975 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.631846 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006a06c2-ba4e-4aea-a817-73e66bd4720a","Type":"ContainerStarted","Data":"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0"} Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.637065 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"09ade431-bbe8-404b-a690-4e1eb2c542f9","Type":"ContainerStarted","Data":"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752"} Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.637173 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"09ade431-bbe8-404b-a690-4e1eb2c542f9","Type":"ContainerStarted","Data":"584d1c36666076a54220e36ce6faab93019edcd5b16bb7f5fc0ee1ecb2935319"} Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.637212 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.667184 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.667156639 podStartE2EDuration="2.667156639s" podCreationTimestamp="2025-10-10 07:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:54.660890059 +0000 UTC m=+5741.756048315" watchObservedRunningTime="2025-10-10 07:59:54.667156639 +0000 UTC m=+5741.762314895" Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.702022 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:59:54 crc kubenswrapper[4822]: W1010 07:59:54.705518 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82979957_20d2_4f04_8595_6ba826b061d9.slice/crio-62effe3d02f5ef935685212204980b0d2b80fd82b744df022116b133832b9970 WatchSource:0}: Error finding container 62effe3d02f5ef935685212204980b0d2b80fd82b744df022116b133832b9970: Status 404 returned error can't find the container with id 62effe3d02f5ef935685212204980b0d2b80fd82b744df022116b133832b9970 Oct 10 07:59:54 crc kubenswrapper[4822]: I1010 07:59:54.707965 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.707944875 podStartE2EDuration="2.707944875s" podCreationTimestamp="2025-10-10 07:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:54.692445489 +0000 UTC m=+5741.787603725" watchObservedRunningTime="2025-10-10 07:59:54.707944875 +0000 UTC m=+5741.803103091" Oct 10 07:59:55 crc kubenswrapper[4822]: I1010 07:59:55.678674 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f619e646-b85d-40a4-bb47-67db89884281" 
path="/var/lib/kubelet/pods/f619e646-b85d-40a4-bb47-67db89884281/volumes" Oct 10 07:59:55 crc kubenswrapper[4822]: I1010 07:59:55.681536 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:55 crc kubenswrapper[4822]: I1010 07:59:55.684178 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"82979957-20d2-4f04-8595-6ba826b061d9","Type":"ContainerStarted","Data":"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8"} Oct 10 07:59:55 crc kubenswrapper[4822]: I1010 07:59:55.684333 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"82979957-20d2-4f04-8595-6ba826b061d9","Type":"ContainerStarted","Data":"62effe3d02f5ef935685212204980b0d2b80fd82b744df022116b133832b9970"} Oct 10 07:59:55 crc kubenswrapper[4822]: I1010 07:59:55.696606 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.696588849 podStartE2EDuration="2.696588849s" podCreationTimestamp="2025-10-10 07:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:59:55.694137678 +0000 UTC m=+5742.789295884" watchObservedRunningTime="2025-10-10 07:59:55.696588849 +0000 UTC m=+5742.791747045" Oct 10 07:59:57 crc kubenswrapper[4822]: I1010 07:59:57.116249 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:59:57 crc kubenswrapper[4822]: I1010 07:59:57.116333 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:59:58 crc kubenswrapper[4822]: I1010 07:59:58.006271 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:59:58 crc kubenswrapper[4822]: I1010 07:59:58.229748 4822 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 07:59:58 crc kubenswrapper[4822]: I1010 07:59:58.650478 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 07:59:58 crc kubenswrapper[4822]: E1010 07:59:58.650752 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 07:59:58 crc kubenswrapper[4822]: I1010 07:59:58.875679 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:58 crc kubenswrapper[4822]: I1010 07:59:58.886236 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:59:59 crc kubenswrapper[4822]: I1010 07:59:59.226690 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 07:59:59 crc kubenswrapper[4822]: I1010 07:59:59.736339 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.149861 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj"] Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.151480 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.153887 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.155357 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.182011 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj"] Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.348870 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.348916 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.348976 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2lw\" (UniqueName: \"kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.450871 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.450929 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.450994 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2lw\" (UniqueName: \"kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.451774 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.461249 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.467499 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2lw\" (UniqueName: \"kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw\") pod \"collect-profiles-29334720-cjcpj\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:00 crc kubenswrapper[4822]: I1010 08:00:00.485531 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:01 crc kubenswrapper[4822]: I1010 08:00:01.055087 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj"] Oct 10 08:00:01 crc kubenswrapper[4822]: I1010 08:00:01.749405 4822 generic.go:334] "Generic (PLEG): container finished" podID="d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" containerID="01c0eb829b912fe4aea4d733c6697631f0b82f1ecec271642cbc6268123c64b1" exitCode=0 Oct 10 08:00:01 crc kubenswrapper[4822]: I1010 08:00:01.749607 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" event={"ID":"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5","Type":"ContainerDied","Data":"01c0eb829b912fe4aea4d733c6697631f0b82f1ecec271642cbc6268123c64b1"} Oct 10 08:00:01 crc kubenswrapper[4822]: I1010 08:00:01.750344 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" 
event={"ID":"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5","Type":"ContainerStarted","Data":"4ab7c1c7eb7bf4a6a2b59af598cbf6fa177bdcdf1b4a7e2656cad72daeb7ffca"} Oct 10 08:00:02 crc kubenswrapper[4822]: I1010 08:00:02.117642 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:00:02 crc kubenswrapper[4822]: I1010 08:00:02.117716 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:00:02 crc kubenswrapper[4822]: I1010 08:00:02.138242 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:00:02 crc kubenswrapper[4822]: I1010 08:00:02.138320 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.006470 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.049642 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.135212 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.243240 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.85:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.243287 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.85:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.243241 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.243277 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.317830 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume\") pod \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.317987 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq2lw\" (UniqueName: \"kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw\") pod \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.318248 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume\") pod \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\" (UID: \"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5\") " Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.318871 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume" (OuterVolumeSpecName: "config-volume") pod "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" (UID: "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.319508 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.325384 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" (UID: "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.325489 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw" (OuterVolumeSpecName: "kube-api-access-bq2lw") pod "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" (UID: "d8f8e2c6-0581-44b8-ac9e-2fde66715ab5"). InnerVolumeSpecName "kube-api-access-bq2lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.421056 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq2lw\" (UniqueName: \"kubernetes.io/projected/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-kube-api-access-bq2lw\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.421094 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.772929 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" event={"ID":"d8f8e2c6-0581-44b8-ac9e-2fde66715ab5","Type":"ContainerDied","Data":"4ab7c1c7eb7bf4a6a2b59af598cbf6fa177bdcdf1b4a7e2656cad72daeb7ffca"} Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.773008 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ab7c1c7eb7bf4a6a2b59af598cbf6fa177bdcdf1b4a7e2656cad72daeb7ffca" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.772952 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj" Oct 10 08:00:03 crc kubenswrapper[4822]: I1010 08:00:03.805760 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 08:00:04 crc kubenswrapper[4822]: I1010 08:00:04.212651 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr"] Oct 10 08:00:04 crc kubenswrapper[4822]: I1010 08:00:04.224155 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-78jhr"] Oct 10 08:00:05 crc kubenswrapper[4822]: I1010 08:00:05.671357 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9dc9779-cb71-4877-9f3a-88ab1a805bb2" path="/var/lib/kubelet/pods/e9dc9779-cb71-4877-9f3a-88ab1a805bb2/volumes" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.969746 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:06 crc kubenswrapper[4822]: E1010 08:00:06.970229 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" containerName="collect-profiles" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.970249 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" containerName="collect-profiles" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.970549 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" containerName="collect-profiles" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.971668 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.980920 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 08:00:06 crc kubenswrapper[4822]: I1010 08:00:06.981512 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.105370 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.105737 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.105949 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.106063 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.106164 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.106303 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208060 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208150 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208192 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208246 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208309 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.208478 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.217912 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.218186 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " 
pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.219118 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.219562 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.231697 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn\") pod \"cinder-scheduler-0\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.296783 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.782380 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:07 crc kubenswrapper[4822]: I1010 08:00:07.817454 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerStarted","Data":"23a3bbab039d62f7de4dc56763b49bff6a7a8ed9e02f564a334ee1fd764ac871"} Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.647840 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.650357 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api-log" containerID="cri-o://7938c47c7fac6492f36e4367f662d924b3a32cba7c26a0ce7ee3cd4018a472d0" gracePeriod=30 Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.650459 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api" containerID="cri-o://d058dc521bd18cd761eaa9bed847e3e0939fd9bf6ba89fb841a94bd4c73d2223" gracePeriod=30 Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.836081 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerStarted","Data":"8d47fa05a23853a52376c8dbb7849b60f34c403130e3fe662dc743d3746fcda1"} Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.839016 4822 generic.go:334] "Generic (PLEG): container finished" podID="10da547f-365f-4f9b-933b-8951adfd60b3" containerID="7938c47c7fac6492f36e4367f662d924b3a32cba7c26a0ce7ee3cd4018a472d0" exitCode=143 Oct 10 08:00:08 crc kubenswrapper[4822]: I1010 08:00:08.839065 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerDied","Data":"7938c47c7fac6492f36e4367f662d924b3a32cba7c26a0ce7ee3cd4018a472d0"} Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.307228 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.309518 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.311410 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.359869 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477545 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfx8h\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-kube-api-access-tfx8h\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477591 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477634 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477655 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477681 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477711 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477731 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477749 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477764 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-run\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477782 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477827 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477853 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " 
pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477895 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477918 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.477934 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582082 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582145 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 
08:00:09.582212 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582256 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582283 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582332 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfx8h\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-kube-api-access-tfx8h\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582359 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582409 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582437 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582470 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582497 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582426 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582510 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-dev\") pod \"cinder-volume-volume1-0\" (UID: 
\"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582693 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582731 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582864 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.582929 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583045 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583104 
4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583133 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-run\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583180 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583204 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583219 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-run\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.583447 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dc37285-4d96-4e82-97ab-e162c50877dc-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.587704 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.588127 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.588727 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.589459 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc37285-4d96-4e82-97ab-e162c50877dc-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.590057 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.607437 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfx8h\" (UniqueName: \"kubernetes.io/projected/4dc37285-4d96-4e82-97ab-e162c50877dc-kube-api-access-tfx8h\") pod \"cinder-volume-volume1-0\" (UID: \"4dc37285-4d96-4e82-97ab-e162c50877dc\") " pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.633046 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.651040 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:00:09 crc kubenswrapper[4822]: E1010 08:00:09.651995 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.919376 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerStarted","Data":"622237ec3208990632aa42b833e8a61dbfe14f5b1c20097a9c5daf1f78c292c2"} Oct 10 08:00:09 crc kubenswrapper[4822]: I1010 08:00:09.962408 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.962378117 podStartE2EDuration="3.962378117s" podCreationTimestamp="2025-10-10 08:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:00:09.959323769 +0000 UTC m=+5757.054481975" watchObservedRunningTime="2025-10-10 08:00:09.962378117 +0000 UTC m=+5757.057536313" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.210844 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.216940 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.227402 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.233311 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324518 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-ceph\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324583 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-run\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324614 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324643 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-scripts\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324667 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-lib-modules\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324845 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324920 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.324952 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325031 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-dev\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325059 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhpf\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-kube-api-access-mkhpf\") pod 
\"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325090 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-sys\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325118 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325141 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325189 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" 
Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.325210 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.383457 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 10 08:00:10 crc kubenswrapper[4822]: W1010 08:00:10.385464 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc37285_4d96_4e82_97ab_e162c50877dc.slice/crio-68a9f0d4852e5ec21cd6a58da7d809de7eb33548d5454d524f992aa8fd0f849f WatchSource:0}: Error finding container 68a9f0d4852e5ec21cd6a58da7d809de7eb33548d5454d524f992aa8fd0f849f: Status 404 returned error can't find the container with id 68a9f0d4852e5ec21cd6a58da7d809de7eb33548d5454d524f992aa8fd0f849f Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427058 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-dev\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427122 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhpf\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-kube-api-access-mkhpf\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427152 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-sys\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-dev\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427176 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427215 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427239 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427275 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc 
kubenswrapper[4822]: I1010 08:00:10.427328 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427349 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427373 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-sys\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427564 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427718 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-ceph\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427750 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-run\") pod 
\"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427787 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427846 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-scripts\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427872 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-lib-modules\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.428009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.427712 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.428128 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-lib-modules\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.428176 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.428149 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-run\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.429010 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.429081 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.429146 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc 
kubenswrapper[4822]: I1010 08:00:10.429211 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6e280c73-7378-4070-8375-4ca5f421790a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.433350 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-scripts\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.441492 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.441781 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.449232 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-ceph\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.450239 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e280c73-7378-4070-8375-4ca5f421790a-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.452731 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhpf\" (UniqueName: \"kubernetes.io/projected/6e280c73-7378-4070-8375-4ca5f421790a-kube-api-access-mkhpf\") pod \"cinder-backup-0\" (UID: \"6e280c73-7378-4070-8375-4ca5f421790a\") " pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.546262 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 10 08:00:10 crc kubenswrapper[4822]: I1010 08:00:10.930148 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4dc37285-4d96-4e82-97ab-e162c50877dc","Type":"ContainerStarted","Data":"68a9f0d4852e5ec21cd6a58da7d809de7eb33548d5454d524f992aa8fd0f849f"} Oct 10 08:00:11 crc kubenswrapper[4822]: I1010 08:00:11.146383 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 10 08:00:11 crc kubenswrapper[4822]: I1010 08:00:11.815659 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.82:8776/healthcheck\": read tcp 10.217.0.2:54094->10.217.1.82:8776: read: connection reset by peer" Oct 10 08:00:11 crc kubenswrapper[4822]: I1010 08:00:11.945896 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6e280c73-7378-4070-8375-4ca5f421790a","Type":"ContainerStarted","Data":"2ad82c712301bf2c9d05ed49b154b2e6d67db5bbb69d23f4e1c04cbd7b83ede6"} Oct 10 08:00:11 crc kubenswrapper[4822]: I1010 08:00:11.948363 4822 generic.go:334] "Generic (PLEG): container finished" podID="10da547f-365f-4f9b-933b-8951adfd60b3" containerID="d058dc521bd18cd761eaa9bed847e3e0939fd9bf6ba89fb841a94bd4c73d2223" 
exitCode=0 Oct 10 08:00:11 crc kubenswrapper[4822]: I1010 08:00:11.948428 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerDied","Data":"d058dc521bd18cd761eaa9bed847e3e0939fd9bf6ba89fb841a94bd4c73d2223"} Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.121022 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.122813 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.125520 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.151541 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.151596 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.152661 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.152735 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.159427 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.164270 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.298554 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 08:00:12 crc 
kubenswrapper[4822]: I1010 08:00:12.408513 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476483 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476576 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476678 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476665 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476735 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476774 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476817 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.476846 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle\") pod \"10da547f-365f-4f9b-933b-8951adfd60b3\" (UID: \"10da547f-365f-4f9b-933b-8951adfd60b3\") " Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.478362 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs" (OuterVolumeSpecName: "logs") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.479640 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10da547f-365f-4f9b-933b-8951adfd60b3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.479664 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10da547f-365f-4f9b-933b-8951adfd60b3-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.482852 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts" (OuterVolumeSpecName: "scripts") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.488593 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.495511 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km" (OuterVolumeSpecName: "kube-api-access-hz7km") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "kube-api-access-hz7km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.525178 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.570204 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data" (OuterVolumeSpecName: "config-data") pod "10da547f-365f-4f9b-933b-8951adfd60b3" (UID: "10da547f-365f-4f9b-933b-8951adfd60b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.582191 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.582254 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7km\" (UniqueName: \"kubernetes.io/projected/10da547f-365f-4f9b-933b-8951adfd60b3-kube-api-access-hz7km\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.582277 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.582297 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 
10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.582312 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10da547f-365f-4f9b-933b-8951adfd60b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.968530 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"10da547f-365f-4f9b-933b-8951adfd60b3","Type":"ContainerDied","Data":"970a8e4b128fc4ca1a529413eeba4efab3a88643e390967334fa055d26639ff7"} Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.969976 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.970420 4822 scope.go:117] "RemoveContainer" containerID="d058dc521bd18cd761eaa9bed847e3e0939fd9bf6ba89fb841a94bd4c73d2223" Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.981553 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4dc37285-4d96-4e82-97ab-e162c50877dc","Type":"ContainerStarted","Data":"bfd8d16bdc3ce13acbccf1c51dd3f734804f7b131ca76719e2f9178812c9bc99"} Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.981591 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4dc37285-4d96-4e82-97ab-e162c50877dc","Type":"ContainerStarted","Data":"9f6d1f1235748c8b9d1aa64ee2295f0939880d92746155ce292cd042ad608479"} Oct 10 08:00:12 crc kubenswrapper[4822]: I1010 08:00:12.985582 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.014227 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.371491365 podStartE2EDuration="4.014203878s" podCreationTimestamp="2025-10-10 08:00:09 +0000 
UTC" firstStartedPulling="2025-10-10 08:00:10.387842324 +0000 UTC m=+5757.483000520" lastFinishedPulling="2025-10-10 08:00:12.030554817 +0000 UTC m=+5759.125713033" observedRunningTime="2025-10-10 08:00:13.006829065 +0000 UTC m=+5760.101987281" watchObservedRunningTime="2025-10-10 08:00:13.014203878 +0000 UTC m=+5760.109362064" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.142630 4822 scope.go:117] "RemoveContainer" containerID="7938c47c7fac6492f36e4367f662d924b3a32cba7c26a0ce7ee3cd4018a472d0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.142845 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.191091 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.204859 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:13 crc kubenswrapper[4822]: E1010 08:00:13.205325 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api-log" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.205348 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api-log" Oct 10 08:00:13 crc kubenswrapper[4822]: E1010 08:00:13.205378 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.205386 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.205605 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api-log" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.205637 
4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" containerName="cinder-api" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.207014 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.211437 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.214681 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.305063 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.305212 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.305552 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.305717 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h77ll\" (UniqueName: 
\"kubernetes.io/projected/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-kube-api-access-h77ll\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.305960 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-logs\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.306062 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.306181 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-scripts\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.408380 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h77ll\" (UniqueName: \"kubernetes.io/projected/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-kube-api-access-h77ll\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.410032 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-logs\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 
08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.410678 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.410599 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-logs\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.411741 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-scripts\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.412014 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.412104 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.412235 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data\") pod 
\"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.412287 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.417382 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.418268 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.421864 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-scripts\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.426874 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.433885 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h77ll\" (UniqueName: 
\"kubernetes.io/projected/1c5d5896-70d2-4754-9d84-a5c6128ba3c5-kube-api-access-h77ll\") pod \"cinder-api-0\" (UID: \"1c5d5896-70d2-4754-9d84-a5c6128ba3c5\") " pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.542101 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:00:13 crc kubenswrapper[4822]: I1010 08:00:13.681844 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10da547f-365f-4f9b-933b-8951adfd60b3" path="/var/lib/kubelet/pods/10da547f-365f-4f9b-933b-8951adfd60b3/volumes" Oct 10 08:00:14 crc kubenswrapper[4822]: I1010 08:00:14.010172 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6e280c73-7378-4070-8375-4ca5f421790a","Type":"ContainerStarted","Data":"edc009666b9ce3cc5bc6dd5a95034c9d98e6c3fc89b128ccbb5bcc79a1166121"} Oct 10 08:00:14 crc kubenswrapper[4822]: I1010 08:00:14.010595 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6e280c73-7378-4070-8375-4ca5f421790a","Type":"ContainerStarted","Data":"62f6c4f7503aeef60d9bf49f575032ffeb334820ae31e3e1218e5536746bbcee"} Oct 10 08:00:14 crc kubenswrapper[4822]: I1010 08:00:14.213767 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.489799057 podStartE2EDuration="4.213741822s" podCreationTimestamp="2025-10-10 08:00:10 +0000 UTC" firstStartedPulling="2025-10-10 08:00:11.169789939 +0000 UTC m=+5758.264948135" lastFinishedPulling="2025-10-10 08:00:12.893732704 +0000 UTC m=+5759.988890900" observedRunningTime="2025-10-10 08:00:14.046347066 +0000 UTC m=+5761.141505272" watchObservedRunningTime="2025-10-10 08:00:14.213741822 +0000 UTC m=+5761.308900018" Oct 10 08:00:14 crc kubenswrapper[4822]: I1010 08:00:14.225618 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:00:14 crc 
kubenswrapper[4822]: W1010 08:00:14.226652 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5d5896_70d2_4754_9d84_a5c6128ba3c5.slice/crio-21690f6ab7b10e849cfaf9f3e7102c13b645b8ace3a3a3a2cf569dac523ebca9 WatchSource:0}: Error finding container 21690f6ab7b10e849cfaf9f3e7102c13b645b8ace3a3a3a2cf569dac523ebca9: Status 404 returned error can't find the container with id 21690f6ab7b10e849cfaf9f3e7102c13b645b8ace3a3a3a2cf569dac523ebca9 Oct 10 08:00:14 crc kubenswrapper[4822]: I1010 08:00:14.634306 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:15 crc kubenswrapper[4822]: I1010 08:00:15.025799 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c5d5896-70d2-4754-9d84-a5c6128ba3c5","Type":"ContainerStarted","Data":"2bec5f613548a398a0ffe38b29d8b8f42533cbb3b8c460c8df8898efb06f0d24"} Oct 10 08:00:15 crc kubenswrapper[4822]: I1010 08:00:15.026185 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c5d5896-70d2-4754-9d84-a5c6128ba3c5","Type":"ContainerStarted","Data":"21690f6ab7b10e849cfaf9f3e7102c13b645b8ace3a3a3a2cf569dac523ebca9"} Oct 10 08:00:15 crc kubenswrapper[4822]: I1010 08:00:15.548271 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 10 08:00:16 crc kubenswrapper[4822]: I1010 08:00:16.040765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c5d5896-70d2-4754-9d84-a5c6128ba3c5","Type":"ContainerStarted","Data":"29ed1aa27fbba34f9fffd9122b161097cca0a4bef8fff8ffedda136e6f7bc36c"} Oct 10 08:00:16 crc kubenswrapper[4822]: I1010 08:00:16.041976 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 08:00:16 crc kubenswrapper[4822]: I1010 08:00:16.073164 4822 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.073139902 podStartE2EDuration="3.073139902s" podCreationTimestamp="2025-10-10 08:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:00:16.062108304 +0000 UTC m=+5763.157266510" watchObservedRunningTime="2025-10-10 08:00:16.073139902 +0000 UTC m=+5763.168298098" Oct 10 08:00:17 crc kubenswrapper[4822]: I1010 08:00:17.532852 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 08:00:17 crc kubenswrapper[4822]: I1010 08:00:17.609740 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:18 crc kubenswrapper[4822]: I1010 08:00:18.060010 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="cinder-scheduler" containerID="cri-o://8d47fa05a23853a52376c8dbb7849b60f34c403130e3fe662dc743d3746fcda1" gracePeriod=30 Oct 10 08:00:18 crc kubenswrapper[4822]: I1010 08:00:18.060107 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="probe" containerID="cri-o://622237ec3208990632aa42b833e8a61dbfe14f5b1c20097a9c5daf1f78c292c2" gracePeriod=30 Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.070873 4822 generic.go:334] "Generic (PLEG): container finished" podID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerID="622237ec3208990632aa42b833e8a61dbfe14f5b1c20097a9c5daf1f78c292c2" exitCode=0 Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.071284 4822 generic.go:334] "Generic (PLEG): container finished" podID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" 
containerID="8d47fa05a23853a52376c8dbb7849b60f34c403130e3fe662dc743d3746fcda1" exitCode=0 Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.071029 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerDied","Data":"622237ec3208990632aa42b833e8a61dbfe14f5b1c20097a9c5daf1f78c292c2"} Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.071338 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerDied","Data":"8d47fa05a23853a52376c8dbb7849b60f34c403130e3fe662dc743d3746fcda1"} Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.361222 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.378225 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.378273 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.378409 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: 
I1010 08:00:19.378437 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.378455 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.378487 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts\") pod \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\" (UID: \"9cd0e3e6-6b37-449e-b29b-e2dfd263a280\") " Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.380382 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.387431 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts" (OuterVolumeSpecName: "scripts") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.391256 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.400575 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn" (OuterVolumeSpecName: "kube-api-access-8bqzn") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "kube-api-access-8bqzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.480565 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-kube-api-access-8bqzn\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.480610 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.480625 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.480640 4822 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.484910 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.500105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data" (OuterVolumeSpecName: "config-data") pod "9cd0e3e6-6b37-449e-b29b-e2dfd263a280" (UID: "9cd0e3e6-6b37-449e-b29b-e2dfd263a280"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.582513 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.582548 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0e3e6-6b37-449e-b29b-e2dfd263a280-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:19 crc kubenswrapper[4822]: I1010 08:00:19.868924 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.082026 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9cd0e3e6-6b37-449e-b29b-e2dfd263a280","Type":"ContainerDied","Data":"23a3bbab039d62f7de4dc56763b49bff6a7a8ed9e02f564a334ee1fd764ac871"} 
Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.082379 4822 scope.go:117] "RemoveContainer" containerID="622237ec3208990632aa42b833e8a61dbfe14f5b1c20097a9c5daf1f78c292c2" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.082079 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.119678 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.143188 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.151227 4822 scope.go:117] "RemoveContainer" containerID="8d47fa05a23853a52376c8dbb7849b60f34c403130e3fe662dc743d3746fcda1" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.153035 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:20 crc kubenswrapper[4822]: E1010 08:00:20.153535 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="cinder-scheduler" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.153550 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="cinder-scheduler" Oct 10 08:00:20 crc kubenswrapper[4822]: E1010 08:00:20.153564 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="probe" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.153569 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="probe" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.153769 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="cinder-scheduler" Oct 10 08:00:20 crc 
kubenswrapper[4822]: I1010 08:00:20.153782 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" containerName="probe" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.154885 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.156785 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.176146 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.194762 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.195165 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk675\" (UniqueName: \"kubernetes.io/projected/89d26112-eba6-439d-a781-ead7c951a525-kube-api-access-jk675\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.195226 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89d26112-eba6-439d-a781-ead7c951a525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.195281 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.195328 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.195370 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-scripts\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296551 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296607 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-scripts\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296645 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296698 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk675\" (UniqueName: \"kubernetes.io/projected/89d26112-eba6-439d-a781-ead7c951a525-kube-api-access-jk675\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296730 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89d26112-eba6-439d-a781-ead7c951a525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.296771 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.297088 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89d26112-eba6-439d-a781-ead7c951a525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.300893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.301356 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.302198 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-scripts\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.303091 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d26112-eba6-439d-a781-ead7c951a525-config-data\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.316738 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk675\" (UniqueName: \"kubernetes.io/projected/89d26112-eba6-439d-a781-ead7c951a525-kube-api-access-jk675\") pod \"cinder-scheduler-0\" (UID: \"89d26112-eba6-439d-a781-ead7c951a525\") " pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.476897 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.649953 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:00:20 crc kubenswrapper[4822]: E1010 08:00:20.650480 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.831611 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 10 08:00:20 crc kubenswrapper[4822]: I1010 08:00:20.960570 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:00:21 crc kubenswrapper[4822]: I1010 08:00:21.096125 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89d26112-eba6-439d-a781-ead7c951a525","Type":"ContainerStarted","Data":"394671169f5509fa1bf6e435049da6aa5df177d5bb5b4d8857976b3328b466d2"} Oct 10 08:00:21 crc kubenswrapper[4822]: I1010 08:00:21.665292 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd0e3e6-6b37-449e-b29b-e2dfd263a280" path="/var/lib/kubelet/pods/9cd0e3e6-6b37-449e-b29b-e2dfd263a280/volumes" Oct 10 08:00:22 crc kubenswrapper[4822]: I1010 08:00:22.108687 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89d26112-eba6-439d-a781-ead7c951a525","Type":"ContainerStarted","Data":"362261174ea0db6a1b06da0383732c2d4458bd76df3534ed1a89f6813fb86b1d"} Oct 10 08:00:23 crc kubenswrapper[4822]: I1010 08:00:23.119785 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89d26112-eba6-439d-a781-ead7c951a525","Type":"ContainerStarted","Data":"775f111d4b241ac706e1080a8993c10e88de8ec377be17d641ef19ae8c9dfbbd"} Oct 10 08:00:23 crc kubenswrapper[4822]: I1010 08:00:23.143497 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.143480792 podStartE2EDuration="3.143480792s" podCreationTimestamp="2025-10-10 08:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:00:23.142925076 +0000 UTC m=+5770.238083272" watchObservedRunningTime="2025-10-10 08:00:23.143480792 +0000 UTC m=+5770.238638998" Oct 10 08:00:23 crc kubenswrapper[4822]: I1010 08:00:23.915159 4822 scope.go:117] "RemoveContainer" containerID="1822d51afeabbd58543b9b625375c30e8d2f940b6ba6af8d2b69373209e3a4dc" Oct 10 08:00:25 crc kubenswrapper[4822]: I1010 08:00:25.478157 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 08:00:25 crc kubenswrapper[4822]: I1010 08:00:25.576110 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 10 08:00:30 crc kubenswrapper[4822]: I1010 08:00:30.689640 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 08:00:35 crc kubenswrapper[4822]: I1010 08:00:35.651496 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:00:35 crc kubenswrapper[4822]: E1010 08:00:35.652596 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:00:46 crc kubenswrapper[4822]: I1010 08:00:46.078116 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l6m4k"] Oct 10 08:00:46 crc kubenswrapper[4822]: I1010 08:00:46.094906 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-l6m4k"] Oct 10 08:00:46 crc kubenswrapper[4822]: I1010 08:00:46.651018 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:00:46 crc kubenswrapper[4822]: E1010 08:00:46.651618 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:00:47 crc kubenswrapper[4822]: I1010 08:00:47.669394 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8dcb89-06cd-4756-9558-6181e3c25ee3" path="/var/lib/kubelet/pods/1b8dcb89-06cd-4756-9558-6181e3c25ee3/volumes" Oct 10 08:00:56 crc kubenswrapper[4822]: I1010 08:00:56.044685 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bd67-account-create-d4j8r"] Oct 10 08:00:56 crc kubenswrapper[4822]: I1010 08:00:56.058871 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bd67-account-create-d4j8r"] Oct 10 08:00:57 crc kubenswrapper[4822]: I1010 08:00:57.650791 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:00:57 
crc kubenswrapper[4822]: E1010 08:00:57.651152 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:00:57 crc kubenswrapper[4822]: I1010 08:00:57.662480 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2a846e-642f-4748-ae24-79441a7c078e" path="/var/lib/kubelet/pods/4d2a846e-642f-4748-ae24-79441a7c078e/volumes" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.152726 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29334721-nzg8p"] Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.154523 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.176344 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334721-nzg8p"] Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.284564 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.284942 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.285054 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.285135 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-277sl\" (UniqueName: \"kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.387484 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-277sl\" (UniqueName: \"kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.387601 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.387668 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.387842 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.395879 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.396283 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.397341 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.410021 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-277sl\" (UniqueName: \"kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl\") pod \"keystone-cron-29334721-nzg8p\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.488640 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:00 crc kubenswrapper[4822]: I1010 08:01:00.969999 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334721-nzg8p"] Oct 10 08:01:01 crc kubenswrapper[4822]: I1010 08:01:01.543482 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334721-nzg8p" event={"ID":"fd700c4a-f515-420b-876b-6875148a7725","Type":"ContainerStarted","Data":"d3204d1a0f3ad53b5a35dbf617abfba1130aaf458cab87861c5f7ecd6a13580a"} Oct 10 08:01:01 crc kubenswrapper[4822]: I1010 08:01:01.545045 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334721-nzg8p" event={"ID":"fd700c4a-f515-420b-876b-6875148a7725","Type":"ContainerStarted","Data":"3f06aaf41594dc92f1c06a66767058d39fc720482aaa9c19d7653bee414a027e"} Oct 10 08:01:01 crc kubenswrapper[4822]: I1010 08:01:01.560338 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29334721-nzg8p" podStartSLOduration=1.5603144960000002 podStartE2EDuration="1.560314496s" podCreationTimestamp="2025-10-10 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:01:01.560063858 +0000 UTC m=+5808.655222114" watchObservedRunningTime="2025-10-10 08:01:01.560314496 +0000 UTC m=+5808.655472712" Oct 10 08:01:03 crc kubenswrapper[4822]: I1010 08:01:03.041393 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-z4slf"] Oct 10 08:01:03 crc kubenswrapper[4822]: I1010 08:01:03.057374 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-z4slf"] Oct 10 08:01:03 crc kubenswrapper[4822]: I1010 08:01:03.564962 4822 generic.go:334] "Generic (PLEG): container finished" podID="fd700c4a-f515-420b-876b-6875148a7725" 
containerID="d3204d1a0f3ad53b5a35dbf617abfba1130aaf458cab87861c5f7ecd6a13580a" exitCode=0 Oct 10 08:01:03 crc kubenswrapper[4822]: I1010 08:01:03.565012 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334721-nzg8p" event={"ID":"fd700c4a-f515-420b-876b-6875148a7725","Type":"ContainerDied","Data":"d3204d1a0f3ad53b5a35dbf617abfba1130aaf458cab87861c5f7ecd6a13580a"} Oct 10 08:01:03 crc kubenswrapper[4822]: I1010 08:01:03.668093 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a44420-0ca2-4319-bb9f-a7dad4d5f40f" path="/var/lib/kubelet/pods/e0a44420-0ca2-4319-bb9f-a7dad4d5f40f/volumes" Oct 10 08:01:04 crc kubenswrapper[4822]: I1010 08:01:04.954076 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.085487 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys\") pod \"fd700c4a-f515-420b-876b-6875148a7725\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.085679 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data\") pod \"fd700c4a-f515-420b-876b-6875148a7725\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.085835 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle\") pod \"fd700c4a-f515-420b-876b-6875148a7725\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.086008 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-277sl\" (UniqueName: \"kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl\") pod \"fd700c4a-f515-420b-876b-6875148a7725\" (UID: \"fd700c4a-f515-420b-876b-6875148a7725\") " Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.100145 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fd700c4a-f515-420b-876b-6875148a7725" (UID: "fd700c4a-f515-420b-876b-6875148a7725"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.100244 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl" (OuterVolumeSpecName: "kube-api-access-277sl") pod "fd700c4a-f515-420b-876b-6875148a7725" (UID: "fd700c4a-f515-420b-876b-6875148a7725"). InnerVolumeSpecName "kube-api-access-277sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.120488 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd700c4a-f515-420b-876b-6875148a7725" (UID: "fd700c4a-f515-420b-876b-6875148a7725"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.160934 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data" (OuterVolumeSpecName: "config-data") pod "fd700c4a-f515-420b-876b-6875148a7725" (UID: "fd700c4a-f515-420b-876b-6875148a7725"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.188178 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.188214 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.188227 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd700c4a-f515-420b-876b-6875148a7725-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.188240 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-277sl\" (UniqueName: \"kubernetes.io/projected/fd700c4a-f515-420b-876b-6875148a7725-kube-api-access-277sl\") on node \"crc\" DevicePath \"\"" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.593440 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334721-nzg8p" event={"ID":"fd700c4a-f515-420b-876b-6875148a7725","Type":"ContainerDied","Data":"3f06aaf41594dc92f1c06a66767058d39fc720482aaa9c19d7653bee414a027e"} Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.593938 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f06aaf41594dc92f1c06a66767058d39fc720482aaa9c19d7653bee414a027e" Oct 10 08:01:05 crc kubenswrapper[4822]: I1010 08:01:05.593538 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334721-nzg8p" Oct 10 08:01:08 crc kubenswrapper[4822]: I1010 08:01:08.650976 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:01:08 crc kubenswrapper[4822]: E1010 08:01:08.651574 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:01:16 crc kubenswrapper[4822]: I1010 08:01:16.052158 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vrz4n"] Oct 10 08:01:16 crc kubenswrapper[4822]: I1010 08:01:16.061903 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vrz4n"] Oct 10 08:01:17 crc kubenswrapper[4822]: I1010 08:01:17.670488 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4078fd0b-9003-4f1a-b877-c05a5b5752fa" path="/var/lib/kubelet/pods/4078fd0b-9003-4f1a-b877-c05a5b5752fa/volumes" Oct 10 08:01:20 crc kubenswrapper[4822]: I1010 08:01:20.650378 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:01:20 crc kubenswrapper[4822]: E1010 08:01:20.651025 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:01:24 
crc kubenswrapper[4822]: I1010 08:01:24.139215 4822 scope.go:117] "RemoveContainer" containerID="a4b792516e305e692468c11066d2dc5008de1c6d34455bf127486abe1dbb99e6" Oct 10 08:01:24 crc kubenswrapper[4822]: I1010 08:01:24.180478 4822 scope.go:117] "RemoveContainer" containerID="af651a343b409b0cf18c4904717d79f766a6c2b2f12e7c2eb0ac609e897e56f0" Oct 10 08:01:24 crc kubenswrapper[4822]: I1010 08:01:24.205216 4822 scope.go:117] "RemoveContainer" containerID="c34c3311fbff5423d0b699c2589e505f4863546d8b8a459d2e2fbba996f779b0" Oct 10 08:01:24 crc kubenswrapper[4822]: I1010 08:01:24.259968 4822 scope.go:117] "RemoveContainer" containerID="61be0a6d5c7c2849ed62d86a0536b615c6874a3718b9c16c5d297be07f10fe99" Oct 10 08:01:32 crc kubenswrapper[4822]: I1010 08:01:32.651055 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:01:32 crc kubenswrapper[4822]: E1010 08:01:32.652477 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:01:44 crc kubenswrapper[4822]: I1010 08:01:44.650609 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:01:44 crc kubenswrapper[4822]: E1010 08:01:44.651378 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:01:58 crc kubenswrapper[4822]: I1010 08:01:58.650546 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:01:58 crc kubenswrapper[4822]: E1010 08:01:58.651495 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:02:12 crc kubenswrapper[4822]: I1010 08:02:12.651525 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:02:12 crc kubenswrapper[4822]: E1010 08:02:12.653136 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.251195 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bckd7"] Oct 10 08:02:14 crc kubenswrapper[4822]: E1010 08:02:14.252839 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd700c4a-f515-420b-876b-6875148a7725" containerName="keystone-cron" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.252903 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd700c4a-f515-420b-876b-6875148a7725" containerName="keystone-cron" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 
08:02:14.253169 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd700c4a-f515-420b-876b-6875148a7725" containerName="keystone-cron" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.256397 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.263781 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-k6fff" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.263987 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.277453 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-82prw"] Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.279075 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.296948 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82prw"] Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.311126 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bckd7"] Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.404859 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pdrm\" (UniqueName: \"kubernetes.io/projected/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-kube-api-access-6pdrm\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.404927 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-scripts\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.404963 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ce8e7a-d332-4f3c-a8ec-30052721c927-scripts\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405214 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-log-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405289 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-run\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405318 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405350 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-etc-ovs\") pod 
\"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405391 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-lib\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405414 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-log\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405447 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqm7\" (UniqueName: \"kubernetes.io/projected/80ce8e7a-d332-4f3c-a8ec-30052721c927-kube-api-access-zkqm7\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.405483 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.508711 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-run\") pod \"ovn-controller-ovs-bckd7\" (UID: 
\"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509138 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-run\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509179 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509336 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509355 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-etc-ovs\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509486 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-etc-ovs\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509621 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-lib\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509698 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-lib\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509791 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-log\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509930 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqm7\" (UniqueName: \"kubernetes.io/projected/80ce8e7a-d332-4f3c-a8ec-30052721c927-kube-api-access-zkqm7\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510070 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510195 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-run-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.509851 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/80ce8e7a-d332-4f3c-a8ec-30052721c927-var-log\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510401 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pdrm\" (UniqueName: \"kubernetes.io/projected/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-kube-api-access-6pdrm\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510504 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-scripts\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510604 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ce8e7a-d332-4f3c-a8ec-30052721c927-scripts\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.510779 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-log-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " 
pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.511096 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-var-log-ovn\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.513700 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80ce8e7a-d332-4f3c-a8ec-30052721c927-scripts\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.513751 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-scripts\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.531426 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pdrm\" (UniqueName: \"kubernetes.io/projected/c00f6312-7e6e-4afd-a8c9-000088ad9fb4-kube-api-access-6pdrm\") pod \"ovn-controller-82prw\" (UID: \"c00f6312-7e6e-4afd-a8c9-000088ad9fb4\") " pod="openstack/ovn-controller-82prw" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.542278 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqm7\" (UniqueName: \"kubernetes.io/projected/80ce8e7a-d332-4f3c-a8ec-30052721c927-kube-api-access-zkqm7\") pod \"ovn-controller-ovs-bckd7\" (UID: \"80ce8e7a-d332-4f3c-a8ec-30052721c927\") " pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.592605 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bckd7" Oct 10 08:02:14 crc kubenswrapper[4822]: I1010 08:02:14.605495 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82prw" Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.320304 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82prw"] Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.675642 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bckd7"] Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.869507 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pzzql"] Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.881172 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pzzql"] Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.881273 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:15 crc kubenswrapper[4822]: I1010 08:02:15.883559 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.054612 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovs-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.054694 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovn-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " 
pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.054744 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-config\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.054941 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9tp\" (UniqueName: \"kubernetes.io/projected/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-kube-api-access-kj9tp\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.157066 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9tp\" (UniqueName: \"kubernetes.io/projected/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-kube-api-access-kj9tp\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.157583 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovs-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.157648 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovn-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " 
pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.157706 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-config\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.157936 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovs-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.158137 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-ovn-rundir\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.158604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-config\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.178709 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9tp\" (UniqueName: \"kubernetes.io/projected/8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad-kube-api-access-kj9tp\") pod \"ovn-controller-metrics-pzzql\" (UID: \"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad\") " pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.216791 4822 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pzzql" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.351667 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82prw" event={"ID":"c00f6312-7e6e-4afd-a8c9-000088ad9fb4","Type":"ContainerStarted","Data":"b20fc366d35e2624ba5e24d7e2e6e467fc18d1e92e1f28c1e9f52b35119623b3"} Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.351712 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82prw" event={"ID":"c00f6312-7e6e-4afd-a8c9-000088ad9fb4","Type":"ContainerStarted","Data":"7fdecf6fd48ea2d5454ffb39f53490870a4414fbd6a6d3bef225037091775313"} Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.351827 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-82prw" Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.360194 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bckd7" event={"ID":"80ce8e7a-d332-4f3c-a8ec-30052721c927","Type":"ContainerStarted","Data":"c0e753774684401acd9153adb0d2f18588f2502f2d2ea93e3c120f01b32a5f24"} Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.360524 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bckd7" event={"ID":"80ce8e7a-d332-4f3c-a8ec-30052721c927","Type":"ContainerStarted","Data":"f9493559153f0a628e2b4e0da45931101432e49c2484f312baaa0c451b300133"} Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.383330 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-82prw" podStartSLOduration=2.383303194 podStartE2EDuration="2.383303194s" podCreationTimestamp="2025-10-10 08:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:02:16.370469284 +0000 UTC m=+5883.465627480" 
watchObservedRunningTime="2025-10-10 08:02:16.383303194 +0000 UTC m=+5883.478461390" Oct 10 08:02:16 crc kubenswrapper[4822]: W1010 08:02:16.742896 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f99b4ad_8bf1_436c_a2a4_ed8064abf4ad.slice/crio-aeb30f968e1ad8d1658fc6c472a6a85a0cba635feef165bbe096e439bf723ca5 WatchSource:0}: Error finding container aeb30f968e1ad8d1658fc6c472a6a85a0cba635feef165bbe096e439bf723ca5: Status 404 returned error can't find the container with id aeb30f968e1ad8d1658fc6c472a6a85a0cba635feef165bbe096e439bf723ca5 Oct 10 08:02:16 crc kubenswrapper[4822]: I1010 08:02:16.745436 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pzzql"] Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.372067 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzzql" event={"ID":"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad","Type":"ContainerStarted","Data":"256006c8ff6a641463a3c95670b600f172a012431a9f7659687bba02485ef018"} Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.372447 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzzql" event={"ID":"8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad","Type":"ContainerStarted","Data":"aeb30f968e1ad8d1658fc6c472a6a85a0cba635feef165bbe096e439bf723ca5"} Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.374775 4822 generic.go:334] "Generic (PLEG): container finished" podID="80ce8e7a-d332-4f3c-a8ec-30052721c927" containerID="c0e753774684401acd9153adb0d2f18588f2502f2d2ea93e3c120f01b32a5f24" exitCode=0 Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.374840 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bckd7" event={"ID":"80ce8e7a-d332-4f3c-a8ec-30052721c927","Type":"ContainerDied","Data":"c0e753774684401acd9153adb0d2f18588f2502f2d2ea93e3c120f01b32a5f24"} Oct 10 08:02:17 
crc kubenswrapper[4822]: I1010 08:02:17.385108 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-xzrsl"]
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.386840 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.401875 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-xzrsl"]
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.408859 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pzzql" podStartSLOduration=2.408829582 podStartE2EDuration="2.408829582s" podCreationTimestamp="2025-10-10 08:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:02:17.396565748 +0000 UTC m=+5884.491723944" watchObservedRunningTime="2025-10-10 08:02:17.408829582 +0000 UTC m=+5884.503987778"
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.492604 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdnrn\" (UniqueName: \"kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn\") pod \"octavia-db-create-xzrsl\" (UID: \"fb74e893-7ef5-491e-8375-722cc4449667\") " pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.594620 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdnrn\" (UniqueName: \"kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn\") pod \"octavia-db-create-xzrsl\" (UID: \"fb74e893-7ef5-491e-8375-722cc4449667\") " pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.614276 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdnrn\" (UniqueName: \"kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn\") pod \"octavia-db-create-xzrsl\" (UID: \"fb74e893-7ef5-491e-8375-722cc4449667\") " pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:17 crc kubenswrapper[4822]: I1010 08:02:17.706402 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.321503 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-xzrsl"]
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.385384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xzrsl" event={"ID":"fb74e893-7ef5-491e-8375-722cc4449667","Type":"ContainerStarted","Data":"ebc2f9ad784fa093bc62d3e63132487b2190d330f44b2125da57e6b20a1db756"}
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.388542 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bckd7" event={"ID":"80ce8e7a-d332-4f3c-a8ec-30052721c927","Type":"ContainerStarted","Data":"99cdaa1366bab7cad9f8c84a49868765e0025908405e270c383c94e86b993a64"}
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.388636 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bckd7" event={"ID":"80ce8e7a-d332-4f3c-a8ec-30052721c927","Type":"ContainerStarted","Data":"e6ee04e3d75cff3df5e4c6155d83448df30f5099c0aabaf1a13c3b72dcd16bbf"}
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.388666 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bckd7"
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.388680 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bckd7"
Oct 10 08:02:18 crc kubenswrapper[4822]: I1010 08:02:18.412991 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bckd7" podStartSLOduration=4.4129721140000004 podStartE2EDuration="4.412972114s" podCreationTimestamp="2025-10-10 08:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:02:18.411195362 +0000 UTC m=+5885.506353568" watchObservedRunningTime="2025-10-10 08:02:18.412972114 +0000 UTC m=+5885.508130310"
Oct 10 08:02:19 crc kubenswrapper[4822]: I1010 08:02:19.397829 4822 generic.go:334] "Generic (PLEG): container finished" podID="fb74e893-7ef5-491e-8375-722cc4449667" containerID="ca00d5e808188ce282ab8ed1846df94d429ea639ec576373609098d8b0c7bd7d" exitCode=0
Oct 10 08:02:19 crc kubenswrapper[4822]: I1010 08:02:19.397966 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xzrsl" event={"ID":"fb74e893-7ef5-491e-8375-722cc4449667","Type":"ContainerDied","Data":"ca00d5e808188ce282ab8ed1846df94d429ea639ec576373609098d8b0c7bd7d"}
Oct 10 08:02:20 crc kubenswrapper[4822]: I1010 08:02:20.870710 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:20 crc kubenswrapper[4822]: I1010 08:02:20.976191 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdnrn\" (UniqueName: \"kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn\") pod \"fb74e893-7ef5-491e-8375-722cc4449667\" (UID: \"fb74e893-7ef5-491e-8375-722cc4449667\") "
Oct 10 08:02:20 crc kubenswrapper[4822]: I1010 08:02:20.984244 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn" (OuterVolumeSpecName: "kube-api-access-xdnrn") pod "fb74e893-7ef5-491e-8375-722cc4449667" (UID: "fb74e893-7ef5-491e-8375-722cc4449667"). InnerVolumeSpecName "kube-api-access-xdnrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:02:21 crc kubenswrapper[4822]: I1010 08:02:21.079708 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdnrn\" (UniqueName: \"kubernetes.io/projected/fb74e893-7ef5-491e-8375-722cc4449667-kube-api-access-xdnrn\") on node \"crc\" DevicePath \"\""
Oct 10 08:02:21 crc kubenswrapper[4822]: I1010 08:02:21.440833 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-xzrsl" event={"ID":"fb74e893-7ef5-491e-8375-722cc4449667","Type":"ContainerDied","Data":"ebc2f9ad784fa093bc62d3e63132487b2190d330f44b2125da57e6b20a1db756"}
Oct 10 08:02:21 crc kubenswrapper[4822]: I1010 08:02:21.440917 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-xzrsl"
Oct 10 08:02:21 crc kubenswrapper[4822]: I1010 08:02:21.440927 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebc2f9ad784fa093bc62d3e63132487b2190d330f44b2125da57e6b20a1db756"
Oct 10 08:02:24 crc kubenswrapper[4822]: I1010 08:02:24.372089 4822 scope.go:117] "RemoveContainer" containerID="dd599bc48401618093c1c2ddb48bb1c858e74067d738a73f477e0683faeb5120"
Oct 10 08:02:24 crc kubenswrapper[4822]: I1010 08:02:24.408452 4822 scope.go:117] "RemoveContainer" containerID="29ace8d0d9db8a4c779fe59823d3232b89eedf7b4cf3542178e8bb889c20b669"
Oct 10 08:02:27 crc kubenswrapper[4822]: I1010 08:02:27.651595 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"
Oct 10 08:02:27 crc kubenswrapper[4822]: E1010 08:02:27.652948 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.458944 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-8707-account-create-vrjp5"]
Oct 10 08:02:30 crc kubenswrapper[4822]: E1010 08:02:30.459917 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb74e893-7ef5-491e-8375-722cc4449667" containerName="mariadb-database-create"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.459934 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb74e893-7ef5-491e-8375-722cc4449667" containerName="mariadb-database-create"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.460296 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb74e893-7ef5-491e-8375-722cc4449667" containerName="mariadb-database-create"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.461195 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.464075 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.493950 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8707-account-create-vrjp5"]
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.609694 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thn7\" (UniqueName: \"kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7\") pod \"octavia-8707-account-create-vrjp5\" (UID: \"2cd92ddf-425c-4223-b122-93691db8c391\") " pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.711311 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thn7\" (UniqueName: \"kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7\") pod \"octavia-8707-account-create-vrjp5\" (UID: \"2cd92ddf-425c-4223-b122-93691db8c391\") " pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.733012 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thn7\" (UniqueName: \"kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7\") pod \"octavia-8707-account-create-vrjp5\" (UID: \"2cd92ddf-425c-4223-b122-93691db8c391\") " pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:30 crc kubenswrapper[4822]: I1010 08:02:30.789977 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:31 crc kubenswrapper[4822]: I1010 08:02:31.330395 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8707-account-create-vrjp5"]
Oct 10 08:02:31 crc kubenswrapper[4822]: I1010 08:02:31.558010 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8707-account-create-vrjp5" event={"ID":"2cd92ddf-425c-4223-b122-93691db8c391","Type":"ContainerStarted","Data":"482bcca52713a1bb97d49d481a621df5d0978353e2bcd4a1cb6e8f1b34120d48"}
Oct 10 08:02:31 crc kubenswrapper[4822]: I1010 08:02:31.558365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8707-account-create-vrjp5" event={"ID":"2cd92ddf-425c-4223-b122-93691db8c391","Type":"ContainerStarted","Data":"c0f37717892e6b84e83ea66055da46afb99e8f963c76bb6b55312c56269d76b7"}
Oct 10 08:02:31 crc kubenswrapper[4822]: I1010 08:02:31.577183 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-8707-account-create-vrjp5" podStartSLOduration=1.5771652999999999 podStartE2EDuration="1.5771653s" podCreationTimestamp="2025-10-10 08:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:02:31.573958558 +0000 UTC m=+5898.669116764" watchObservedRunningTime="2025-10-10 08:02:31.5771653 +0000 UTC m=+5898.672323496"
Oct 10 08:02:32 crc kubenswrapper[4822]: I1010 08:02:32.569435 4822 generic.go:334] "Generic (PLEG): container finished" podID="2cd92ddf-425c-4223-b122-93691db8c391" containerID="482bcca52713a1bb97d49d481a621df5d0978353e2bcd4a1cb6e8f1b34120d48" exitCode=0
Oct 10 08:02:32 crc kubenswrapper[4822]: I1010 08:02:32.569513 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8707-account-create-vrjp5" event={"ID":"2cd92ddf-425c-4223-b122-93691db8c391","Type":"ContainerDied","Data":"482bcca52713a1bb97d49d481a621df5d0978353e2bcd4a1cb6e8f1b34120d48"}
Oct 10 08:02:33 crc kubenswrapper[4822]: I1010 08:02:33.982788 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.107716 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5thn7\" (UniqueName: \"kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7\") pod \"2cd92ddf-425c-4223-b122-93691db8c391\" (UID: \"2cd92ddf-425c-4223-b122-93691db8c391\") "
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.116216 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7" (OuterVolumeSpecName: "kube-api-access-5thn7") pod "2cd92ddf-425c-4223-b122-93691db8c391" (UID: "2cd92ddf-425c-4223-b122-93691db8c391"). InnerVolumeSpecName "kube-api-access-5thn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.211032 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5thn7\" (UniqueName: \"kubernetes.io/projected/2cd92ddf-425c-4223-b122-93691db8c391-kube-api-access-5thn7\") on node \"crc\" DevicePath \"\""
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.588895 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8707-account-create-vrjp5" event={"ID":"2cd92ddf-425c-4223-b122-93691db8c391","Type":"ContainerDied","Data":"c0f37717892e6b84e83ea66055da46afb99e8f963c76bb6b55312c56269d76b7"}
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.588958 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f37717892e6b84e83ea66055da46afb99e8f963c76bb6b55312c56269d76b7"
Oct 10 08:02:34 crc kubenswrapper[4822]: I1010 08:02:34.589570 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8707-account-create-vrjp5"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.492328 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-lcs5n"]
Oct 10 08:02:36 crc kubenswrapper[4822]: E1010 08:02:36.493132 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd92ddf-425c-4223-b122-93691db8c391" containerName="mariadb-account-create"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.493150 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd92ddf-425c-4223-b122-93691db8c391" containerName="mariadb-account-create"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.493408 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd92ddf-425c-4223-b122-93691db8c391" containerName="mariadb-account-create"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.494228 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.512237 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-lcs5n"]
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.569638 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpp5\" (UniqueName: \"kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5\") pod \"octavia-persistence-db-create-lcs5n\" (UID: \"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f\") " pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.675983 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpp5\" (UniqueName: \"kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5\") pod \"octavia-persistence-db-create-lcs5n\" (UID: \"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f\") " pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.705711 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpp5\" (UniqueName: \"kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5\") pod \"octavia-persistence-db-create-lcs5n\" (UID: \"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f\") " pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:36 crc kubenswrapper[4822]: I1010 08:02:36.813250 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:37 crc kubenswrapper[4822]: I1010 08:02:37.349568 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-lcs5n"]
Oct 10 08:02:37 crc kubenswrapper[4822]: I1010 08:02:37.626585 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lcs5n" event={"ID":"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f","Type":"ContainerStarted","Data":"04cdb3a6edb7543c92fab6c5e7549b2ca5d6862bb901ea4fd2358f087fb6856b"}
Oct 10 08:02:37 crc kubenswrapper[4822]: I1010 08:02:37.626649 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lcs5n" event={"ID":"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f","Type":"ContainerStarted","Data":"09359dd96a400d9e043e8ab0d30fc5144bc511f4e62c5ae2770b6da049013f55"}
Oct 10 08:02:37 crc kubenswrapper[4822]: I1010 08:02:37.641025 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-persistence-db-create-lcs5n" podStartSLOduration=1.641005772 podStartE2EDuration="1.641005772s" podCreationTimestamp="2025-10-10 08:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:02:37.640236859 +0000 UTC m=+5904.735395055" watchObservedRunningTime="2025-10-10 08:02:37.641005772 +0000 UTC m=+5904.736163968"
Oct 10 08:02:38 crc kubenswrapper[4822]: I1010 08:02:38.653926 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"
Oct 10 08:02:38 crc kubenswrapper[4822]: I1010 08:02:38.654298 4822 generic.go:334] "Generic (PLEG): container finished" podID="b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" containerID="04cdb3a6edb7543c92fab6c5e7549b2ca5d6862bb901ea4fd2358f087fb6856b" exitCode=0
Oct 10 08:02:38 crc kubenswrapper[4822]: I1010 08:02:38.654332 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lcs5n" event={"ID":"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f","Type":"ContainerDied","Data":"04cdb3a6edb7543c92fab6c5e7549b2ca5d6862bb901ea4fd2358f087fb6856b"}
Oct 10 08:02:38 crc kubenswrapper[4822]: E1010 08:02:38.654859 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.063183 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.176148 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwpp5\" (UniqueName: \"kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5\") pod \"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f\" (UID: \"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f\") "
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.185300 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5" (OuterVolumeSpecName: "kube-api-access-dwpp5") pod "b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" (UID: "b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f"). InnerVolumeSpecName "kube-api-access-dwpp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.280547 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwpp5\" (UniqueName: \"kubernetes.io/projected/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f-kube-api-access-dwpp5\") on node \"crc\" DevicePath \"\""
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.694699 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-lcs5n" event={"ID":"b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f","Type":"ContainerDied","Data":"09359dd96a400d9e043e8ab0d30fc5144bc511f4e62c5ae2770b6da049013f55"}
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.694741 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09359dd96a400d9e043e8ab0d30fc5144bc511f4e62c5ae2770b6da049013f55"
Oct 10 08:02:40 crc kubenswrapper[4822]: I1010 08:02:40.695161 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-lcs5n"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.448413 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-1b34-account-create-5ls9k"]
Oct 10 08:02:48 crc kubenswrapper[4822]: E1010 08:02:48.449734 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" containerName="mariadb-database-create"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.449754 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" containerName="mariadb-database-create"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.450034 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" containerName="mariadb-database-create"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.450961 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.453223 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.461456 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-1b34-account-create-5ls9k"]
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.490327 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnbbb\" (UniqueName: \"kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb\") pod \"octavia-1b34-account-create-5ls9k\" (UID: \"044fd4b7-7693-4a14-ab68-9003c5ecc759\") " pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.593331 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbbb\" (UniqueName: \"kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb\") pod \"octavia-1b34-account-create-5ls9k\" (UID: \"044fd4b7-7693-4a14-ab68-9003c5ecc759\") " pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.625765 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnbbb\" (UniqueName: \"kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb\") pod \"octavia-1b34-account-create-5ls9k\" (UID: \"044fd4b7-7693-4a14-ab68-9003c5ecc759\") " pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:48 crc kubenswrapper[4822]: I1010 08:02:48.810566 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.262656 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-1b34-account-create-5ls9k"]
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.647760 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bckd7"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.671890 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bckd7"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.678021 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-82prw" podUID="c00f6312-7e6e-4afd-a8c9-000088ad9fb4" containerName="ovn-controller" probeResult="failure" output=<
Oct 10 08:02:49 crc kubenswrapper[4822]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 10 08:02:49 crc kubenswrapper[4822]: >
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.794702 4822 generic.go:334] "Generic (PLEG): container finished" podID="044fd4b7-7693-4a14-ab68-9003c5ecc759" containerID="d588d39a8b8b662002493c412e920b10057591498234ff49d0cd35c93e689026" exitCode=0
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.794787 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-1b34-account-create-5ls9k" event={"ID":"044fd4b7-7693-4a14-ab68-9003c5ecc759","Type":"ContainerDied","Data":"d588d39a8b8b662002493c412e920b10057591498234ff49d0cd35c93e689026"}
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.794885 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-1b34-account-create-5ls9k" event={"ID":"044fd4b7-7693-4a14-ab68-9003c5ecc759","Type":"ContainerStarted","Data":"ab01957bc1467bddd109dabba255af77296e5335f8e3fcd0089061c90941645b"}
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.826334 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-82prw-config-vm2rx"]
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.828607 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.840154 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.856565 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82prw-config-vm2rx"]
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928599 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928665 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928697 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfpz\" (UniqueName: \"kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928780 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928829 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:49 crc kubenswrapper[4822]: I1010 08:02:49.928898 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030609 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030695 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030738 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030767 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfpz\" (UniqueName: \"kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030875 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.030907 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.031130 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.031165 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.031179 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.031552 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.033687 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.052633 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfpz\" (UniqueName: \"kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz\") pod \"ovn-controller-82prw-config-vm2rx\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.152976 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-82prw-config-vm2rx"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.642012 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-82prw-config-vm2rx"]
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.650879 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"
Oct 10 08:02:50 crc kubenswrapper[4822]: E1010 08:02:50.651391 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 08:02:50 crc kubenswrapper[4822]: I1010 08:02:50.813171 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82prw-config-vm2rx" event={"ID":"d821894f-6c3f-4146-8dda-964344f23daa","Type":"ContainerStarted","Data":"7adf776b6867f69d47281ffd20e34e1839ba91cd901b1d30f904598c2f745a1c"}
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.175932 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.353828 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnbbb\" (UniqueName: \"kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb\") pod \"044fd4b7-7693-4a14-ab68-9003c5ecc759\" (UID: \"044fd4b7-7693-4a14-ab68-9003c5ecc759\") "
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.369303 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb" (OuterVolumeSpecName: "kube-api-access-cnbbb") pod "044fd4b7-7693-4a14-ab68-9003c5ecc759" (UID: "044fd4b7-7693-4a14-ab68-9003c5ecc759"). InnerVolumeSpecName "kube-api-access-cnbbb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.457551 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnbbb\" (UniqueName: \"kubernetes.io/projected/044fd4b7-7693-4a14-ab68-9003c5ecc759-kube-api-access-cnbbb\") on node \"crc\" DevicePath \"\""
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.832308 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-1b34-account-create-5ls9k"
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.832313 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-1b34-account-create-5ls9k" event={"ID":"044fd4b7-7693-4a14-ab68-9003c5ecc759","Type":"ContainerDied","Data":"ab01957bc1467bddd109dabba255af77296e5335f8e3fcd0089061c90941645b"}
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.832985 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab01957bc1467bddd109dabba255af77296e5335f8e3fcd0089061c90941645b"
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.837318 4822 generic.go:334] "Generic (PLEG): container finished" podID="d821894f-6c3f-4146-8dda-964344f23daa" containerID="793af02f6b261990a5b68b6f2a67898a2978511b72f67751c0058a9ff50d5f12" exitCode=0
Oct 10 08:02:51 crc kubenswrapper[4822]: I1010 08:02:51.837464 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82prw-config-vm2rx" event={"ID":"d821894f-6c3f-4146-8dda-964344f23daa","Type":"ContainerDied","Data":"793af02f6b261990a5b68b6f2a67898a2978511b72f67751c0058a9ff50d5f12"}
Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.260632 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82prw-config-vm2rx" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.401959 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402124 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402190 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402507 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run" (OuterVolumeSpecName: "var-run") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402699 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tfpz\" (UniqueName: \"kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402743 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.402850 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn\") pod \"d821894f-6c3f-4146-8dda-964344f23daa\" (UID: \"d821894f-6c3f-4146-8dda-964344f23daa\") " Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403030 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403085 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403151 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403676 4822 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403707 4822 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403728 4822 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.403745 4822 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d821894f-6c3f-4146-8dda-964344f23daa-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.404207 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts" (OuterVolumeSpecName: "scripts") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.411739 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz" (OuterVolumeSpecName: "kube-api-access-6tfpz") pod "d821894f-6c3f-4146-8dda-964344f23daa" (UID: "d821894f-6c3f-4146-8dda-964344f23daa"). InnerVolumeSpecName "kube-api-access-6tfpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.506369 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tfpz\" (UniqueName: \"kubernetes.io/projected/d821894f-6c3f-4146-8dda-964344f23daa-kube-api-access-6tfpz\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.506446 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d821894f-6c3f-4146-8dda-964344f23daa-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.877172 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-82prw-config-vm2rx" event={"ID":"d821894f-6c3f-4146-8dda-964344f23daa","Type":"ContainerDied","Data":"7adf776b6867f69d47281ffd20e34e1839ba91cd901b1d30f904598c2f745a1c"} Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.877216 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7adf776b6867f69d47281ffd20e34e1839ba91cd901b1d30f904598c2f745a1c" Oct 10 08:02:53 crc kubenswrapper[4822]: I1010 08:02:53.877288 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-82prw-config-vm2rx" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.350754 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-82prw-config-vm2rx"] Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.362990 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-82prw-config-vm2rx"] Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.673324 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-82prw" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.905067 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-bf4dc558d-pljb9"] Oct 10 08:02:54 crc kubenswrapper[4822]: E1010 08:02:54.905478 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d821894f-6c3f-4146-8dda-964344f23daa" containerName="ovn-config" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.905497 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d821894f-6c3f-4146-8dda-964344f23daa" containerName="ovn-config" Oct 10 08:02:54 crc kubenswrapper[4822]: E1010 08:02:54.905517 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044fd4b7-7693-4a14-ab68-9003c5ecc759" containerName="mariadb-account-create" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.905523 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="044fd4b7-7693-4a14-ab68-9003c5ecc759" containerName="mariadb-account-create" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.905724 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d821894f-6c3f-4146-8dda-964344f23daa" containerName="ovn-config" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.905737 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="044fd4b7-7693-4a14-ab68-9003c5ecc759" containerName="mariadb-account-create" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 
08:02:54.908299 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.910426 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.910636 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-k9hv9" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.912486 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 10 08:02:54 crc kubenswrapper[4822]: I1010 08:02:54.931510 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-bf4dc558d-pljb9"] Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.036603 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-octavia-run\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.036747 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-config-data-merged\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.036773 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-scripts\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " 
pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.036907 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-combined-ca-bundle\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.036943 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-config-data\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.138690 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-config-data-merged\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.138751 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-scripts\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.138874 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-combined-ca-bundle\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " 
pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.138918 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-config-data\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.138965 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-octavia-run\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.139708 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-octavia-run\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.142949 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/14451cd9-53f4-44f9-987c-d7764a65543d-config-data-merged\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.147365 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-config-data\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.150597 
4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-combined-ca-bundle\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.164432 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14451cd9-53f4-44f9-987c-d7764a65543d-scripts\") pod \"octavia-api-bf4dc558d-pljb9\" (UID: \"14451cd9-53f4-44f9-987c-d7764a65543d\") " pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.228487 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.667637 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d821894f-6c3f-4146-8dda-964344f23daa" path="/var/lib/kubelet/pods/d821894f-6c3f-4146-8dda-964344f23daa/volumes" Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.744751 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-bf4dc558d-pljb9"] Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.766566 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:02:55 crc kubenswrapper[4822]: I1010 08:02:55.904431 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bf4dc558d-pljb9" event={"ID":"14451cd9-53f4-44f9-987c-d7764a65543d","Type":"ContainerStarted","Data":"478cb966d1728ab0a45dd422520cba86760a82dfd4e568cd2ada169ef0112c80"} Oct 10 08:03:03 crc kubenswrapper[4822]: I1010 08:03:03.658379 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:03 crc kubenswrapper[4822]: E1010 08:03:03.659594 
4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:03:06 crc kubenswrapper[4822]: E1010 08:03:06.044128 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14451cd9_53f4_44f9_987c_d7764a65543d.slice/crio-conmon-234cb275a60699dcbb734253bd49829ce4771286f2b8db8469267c5591bc2237.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:03:06 crc kubenswrapper[4822]: I1010 08:03:06.067730 4822 generic.go:334] "Generic (PLEG): container finished" podID="14451cd9-53f4-44f9-987c-d7764a65543d" containerID="234cb275a60699dcbb734253bd49829ce4771286f2b8db8469267c5591bc2237" exitCode=0 Oct 10 08:03:06 crc kubenswrapper[4822]: I1010 08:03:06.067792 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bf4dc558d-pljb9" event={"ID":"14451cd9-53f4-44f9-987c-d7764a65543d","Type":"ContainerDied","Data":"234cb275a60699dcbb734253bd49829ce4771286f2b8db8469267c5591bc2237"} Oct 10 08:03:07 crc kubenswrapper[4822]: I1010 08:03:07.081512 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bf4dc558d-pljb9" event={"ID":"14451cd9-53f4-44f9-987c-d7764a65543d","Type":"ContainerStarted","Data":"e6603a2a4be37b48ffb6618699800f31ae1547093a71781def8d761ebccc3061"} Oct 10 08:03:07 crc kubenswrapper[4822]: I1010 08:03:07.082178 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:03:07 crc kubenswrapper[4822]: I1010 08:03:07.082199 4822 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:03:07 crc kubenswrapper[4822]: I1010 08:03:07.082210 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-bf4dc558d-pljb9" event={"ID":"14451cd9-53f4-44f9-987c-d7764a65543d","Type":"ContainerStarted","Data":"0aca25a23ee465cdc2d7ff1e91c84e50b0732ab2b2f646831c16dfabdccc8af0"} Oct 10 08:03:07 crc kubenswrapper[4822]: I1010 08:03:07.108499 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-bf4dc558d-pljb9" podStartSLOduration=3.602189549 podStartE2EDuration="13.108464942s" podCreationTimestamp="2025-10-10 08:02:54 +0000 UTC" firstStartedPulling="2025-10-10 08:02:55.766194774 +0000 UTC m=+5922.861352970" lastFinishedPulling="2025-10-10 08:03:05.272470167 +0000 UTC m=+5932.367628363" observedRunningTime="2025-10-10 08:03:07.10561897 +0000 UTC m=+5934.200777186" watchObservedRunningTime="2025-10-10 08:03:07.108464942 +0000 UTC m=+5934.203623138" Oct 10 08:03:14 crc kubenswrapper[4822]: I1010 08:03:14.651524 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:14 crc kubenswrapper[4822]: E1010 08:03:14.652930 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.520850 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-wcrxk"] Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.523960 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.526385 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.526461 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.528119 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.537152 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wcrxk"] Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.558654 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e63af2a0-2c50-4576-971d-b276062144d6-config-data-merged\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.558995 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e63af2a0-2c50-4576-971d-b276062144d6-hm-ports\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.559198 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-scripts\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.559324 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-config-data\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.661840 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-scripts\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.662343 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-config-data\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.662547 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e63af2a0-2c50-4576-971d-b276062144d6-config-data-merged\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.662663 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e63af2a0-2c50-4576-971d-b276062144d6-hm-ports\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.663639 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e63af2a0-2c50-4576-971d-b276062144d6-config-data-merged\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.665788 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e63af2a0-2c50-4576-971d-b276062144d6-hm-ports\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.669934 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-scripts\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.670280 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63af2a0-2c50-4576-971d-b276062144d6-config-data\") pod \"octavia-rsyslog-wcrxk\" (UID: \"e63af2a0-2c50-4576-971d-b276062144d6\") " pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:24 crc kubenswrapper[4822]: I1010 08:03:24.848762 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.333198 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.343103 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.349911 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.359109 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.482133 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.482576 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.585049 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.585157 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: 
\"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.585947 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.640348 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config\") pod \"octavia-image-upload-59f8cff499-4gwkl\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:25 crc kubenswrapper[4822]: I1010 08:03:25.680303 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:03:26 crc kubenswrapper[4822]: I1010 08:03:26.105542 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wcrxk"] Oct 10 08:03:26 crc kubenswrapper[4822]: I1010 08:03:26.319545 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wcrxk" event={"ID":"e63af2a0-2c50-4576-971d-b276062144d6","Type":"ContainerStarted","Data":"94700a57cb26667f0a66aee5d301222ab064a1cba284acdbffa0a16b937ca865"} Oct 10 08:03:26 crc kubenswrapper[4822]: I1010 08:03:26.597857 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:03:26 crc kubenswrapper[4822]: I1010 08:03:26.650821 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:26 crc kubenswrapper[4822]: E1010 08:03:26.651031 4822 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.340634 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerStarted","Data":"0cac96d19a7291131e1fbef32331ee8b933aa540efc4d0cd05facbfb5452334d"} Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.391596 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-k7mlg"] Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.394155 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.400763 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.416606 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-k7mlg"] Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.554092 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.554244 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.554583 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.554769 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.658332 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.658503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.658577 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data\") pod 
\"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.658655 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.659339 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.670202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.676485 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 08:03:27.698854 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle\") pod \"octavia-db-sync-k7mlg\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:27 crc kubenswrapper[4822]: I1010 
08:03:27.744699 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:28 crc kubenswrapper[4822]: I1010 08:03:28.629733 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-k7mlg"] Oct 10 08:03:28 crc kubenswrapper[4822]: W1010 08:03:28.801061 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7877f7a6_99da_42ea_9e23_57b3ac4612de.slice/crio-2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6 WatchSource:0}: Error finding container 2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6: Status 404 returned error can't find the container with id 2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6 Oct 10 08:03:29 crc kubenswrapper[4822]: I1010 08:03:29.372283 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-k7mlg" event={"ID":"7877f7a6-99da-42ea-9e23-57b3ac4612de","Type":"ContainerStarted","Data":"2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6"} Oct 10 08:03:30 crc kubenswrapper[4822]: I1010 08:03:30.162630 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:03:30 crc kubenswrapper[4822]: I1010 08:03:30.386329 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wcrxk" event={"ID":"e63af2a0-2c50-4576-971d-b276062144d6","Type":"ContainerStarted","Data":"5924e2c4a93d1f2386bf31bcbd682cf139f67af11fce18b7511c995252d28f65"} Oct 10 08:03:30 crc kubenswrapper[4822]: I1010 08:03:30.390954 4822 generic.go:334] "Generic (PLEG): container finished" podID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerID="58da1e8080e70ed5b34ed9647cfebc630f63aa6446632666982b3185eed1bc11" exitCode=0 Oct 10 08:03:30 crc kubenswrapper[4822]: I1010 08:03:30.391005 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-db-sync-k7mlg" event={"ID":"7877f7a6-99da-42ea-9e23-57b3ac4612de","Type":"ContainerDied","Data":"58da1e8080e70ed5b34ed9647cfebc630f63aa6446632666982b3185eed1bc11"} Oct 10 08:03:30 crc kubenswrapper[4822]: I1010 08:03:30.566910 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-bf4dc558d-pljb9" Oct 10 08:03:31 crc kubenswrapper[4822]: I1010 08:03:31.407699 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-k7mlg" event={"ID":"7877f7a6-99da-42ea-9e23-57b3ac4612de","Type":"ContainerStarted","Data":"775f60e87eee22eb77e41cb2b5e36d4b6186213609467c37676e194dc6574eb5"} Oct 10 08:03:31 crc kubenswrapper[4822]: I1010 08:03:31.426158 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-k7mlg" podStartSLOduration=4.426135328 podStartE2EDuration="4.426135328s" podCreationTimestamp="2025-10-10 08:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:03:31.421775422 +0000 UTC m=+5958.516933628" watchObservedRunningTime="2025-10-10 08:03:31.426135328 +0000 UTC m=+5958.521293534" Oct 10 08:03:31 crc kubenswrapper[4822]: I1010 08:03:31.995654 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:31 crc kubenswrapper[4822]: I1010 08:03:31.999025 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.040199 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.179584 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.179639 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.179975 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwl4c\" (UniqueName: \"kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.282527 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.282590 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.282672 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwl4c\" (UniqueName: \"kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.283237 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.283302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.307474 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwl4c\" (UniqueName: \"kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c\") pod \"redhat-marketplace-d5fwq\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.320860 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.438692 4822 generic.go:334] "Generic (PLEG): container finished" podID="e63af2a0-2c50-4576-971d-b276062144d6" containerID="5924e2c4a93d1f2386bf31bcbd682cf139f67af11fce18b7511c995252d28f65" exitCode=0 Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.439908 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wcrxk" event={"ID":"e63af2a0-2c50-4576-971d-b276062144d6","Type":"ContainerDied","Data":"5924e2c4a93d1f2386bf31bcbd682cf139f67af11fce18b7511c995252d28f65"} Oct 10 08:03:32 crc kubenswrapper[4822]: I1010 08:03:32.926030 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:33 crc kubenswrapper[4822]: I1010 08:03:33.451243 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerStarted","Data":"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc"} Oct 10 08:03:33 crc kubenswrapper[4822]: I1010 08:03:33.451294 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerStarted","Data":"4b13a9dad4f87fd36d64f0ef8177c92cd3a3c1e8769d45ccae1f672f0a05593f"} Oct 10 08:03:34 crc kubenswrapper[4822]: I1010 08:03:34.475333 4822 generic.go:334] "Generic (PLEG): container finished" podID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerID="310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc" exitCode=0 Oct 10 08:03:34 crc kubenswrapper[4822]: I1010 08:03:34.475607 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" 
event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerDied","Data":"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc"} Oct 10 08:03:34 crc kubenswrapper[4822]: I1010 08:03:34.480261 4822 generic.go:334] "Generic (PLEG): container finished" podID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerID="775f60e87eee22eb77e41cb2b5e36d4b6186213609467c37676e194dc6574eb5" exitCode=0 Oct 10 08:03:34 crc kubenswrapper[4822]: I1010 08:03:34.480307 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-k7mlg" event={"ID":"7877f7a6-99da-42ea-9e23-57b3ac4612de","Type":"ContainerDied","Data":"775f60e87eee22eb77e41cb2b5e36d4b6186213609467c37676e194dc6574eb5"} Oct 10 08:03:36 crc kubenswrapper[4822]: I1010 08:03:36.991222 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.088346 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle\") pod \"7877f7a6-99da-42ea-9e23-57b3ac4612de\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.088433 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data\") pod \"7877f7a6-99da-42ea-9e23-57b3ac4612de\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.088560 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged\") pod \"7877f7a6-99da-42ea-9e23-57b3ac4612de\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " Oct 10 08:03:37 crc kubenswrapper[4822]: 
I1010 08:03:37.088598 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts\") pod \"7877f7a6-99da-42ea-9e23-57b3ac4612de\" (UID: \"7877f7a6-99da-42ea-9e23-57b3ac4612de\") " Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.097254 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data" (OuterVolumeSpecName: "config-data") pod "7877f7a6-99da-42ea-9e23-57b3ac4612de" (UID: "7877f7a6-99da-42ea-9e23-57b3ac4612de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.111602 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts" (OuterVolumeSpecName: "scripts") pod "7877f7a6-99da-42ea-9e23-57b3ac4612de" (UID: "7877f7a6-99da-42ea-9e23-57b3ac4612de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.123984 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "7877f7a6-99da-42ea-9e23-57b3ac4612de" (UID: "7877f7a6-99da-42ea-9e23-57b3ac4612de"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.130574 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7877f7a6-99da-42ea-9e23-57b3ac4612de" (UID: "7877f7a6-99da-42ea-9e23-57b3ac4612de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.191308 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.191358 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.191369 4822 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7877f7a6-99da-42ea-9e23-57b3ac4612de-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.191380 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7877f7a6-99da-42ea-9e23-57b3ac4612de-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.514439 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-k7mlg" event={"ID":"7877f7a6-99da-42ea-9e23-57b3ac4612de","Type":"ContainerDied","Data":"2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6"} Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.514476 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2473d6c4d7db0e316929317680219bd599b8b71ec111335436655b0c1c6065f6" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.514567 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-k7mlg" Oct 10 08:03:37 crc kubenswrapper[4822]: I1010 08:03:37.651199 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:37 crc kubenswrapper[4822]: E1010 08:03:37.651881 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.529856 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wcrxk" event={"ID":"e63af2a0-2c50-4576-971d-b276062144d6","Type":"ContainerStarted","Data":"bf07dda320c18294d8b21e8d8968324bdb49505b0a1914190a2104d32f6ad9bd"} Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.531150 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.534189 4822 generic.go:334] "Generic (PLEG): container finished" podID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerID="102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799" exitCode=0 Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.534274 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerDied","Data":"102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799"} Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.536323 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" 
event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerStarted","Data":"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560"} Oct 10 08:03:38 crc kubenswrapper[4822]: I1010 08:03:38.571996 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-wcrxk" podStartSLOduration=3.295791482 podStartE2EDuration="14.571964611s" podCreationTimestamp="2025-10-10 08:03:24 +0000 UTC" firstStartedPulling="2025-10-10 08:03:26.134853935 +0000 UTC m=+5953.230012131" lastFinishedPulling="2025-10-10 08:03:37.411027054 +0000 UTC m=+5964.506185260" observedRunningTime="2025-10-10 08:03:38.560982915 +0000 UTC m=+5965.656141131" watchObservedRunningTime="2025-10-10 08:03:38.571964611 +0000 UTC m=+5965.667122807" Oct 10 08:03:39 crc kubenswrapper[4822]: I1010 08:03:39.050472 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-s92rw"] Oct 10 08:03:39 crc kubenswrapper[4822]: I1010 08:03:39.063143 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-s92rw"] Oct 10 08:03:39 crc kubenswrapper[4822]: I1010 08:03:39.664765 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d" path="/var/lib/kubelet/pods/d95c2e75-5f55-4d7a-94f3-3cdef8e2a93d/volumes" Oct 10 08:03:43 crc kubenswrapper[4822]: I1010 08:03:43.601291 4822 generic.go:334] "Generic (PLEG): container finished" podID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerID="eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560" exitCode=0 Oct 10 08:03:43 crc kubenswrapper[4822]: I1010 08:03:43.601400 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerDied","Data":"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560"} Oct 10 08:03:44 crc kubenswrapper[4822]: I1010 08:03:44.624266 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerStarted","Data":"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed"} Oct 10 08:03:44 crc kubenswrapper[4822]: I1010 08:03:44.648226 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5fwq" podStartSLOduration=6.791443721 podStartE2EDuration="13.648199081s" podCreationTimestamp="2025-10-10 08:03:31 +0000 UTC" firstStartedPulling="2025-10-10 08:03:36.804099137 +0000 UTC m=+5963.899257333" lastFinishedPulling="2025-10-10 08:03:43.660854497 +0000 UTC m=+5970.756012693" observedRunningTime="2025-10-10 08:03:44.643620099 +0000 UTC m=+5971.738778335" watchObservedRunningTime="2025-10-10 08:03:44.648199081 +0000 UTC m=+5971.743357267" Oct 10 08:03:45 crc kubenswrapper[4822]: I1010 08:03:45.638954 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerStarted","Data":"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2"} Oct 10 08:03:45 crc kubenswrapper[4822]: I1010 08:03:45.665188 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" podStartSLOduration=2.578515956 podStartE2EDuration="20.665162197s" podCreationTimestamp="2025-10-10 08:03:25 +0000 UTC" firstStartedPulling="2025-10-10 08:03:26.618469069 +0000 UTC m=+5953.713627265" lastFinishedPulling="2025-10-10 08:03:44.70511531 +0000 UTC m=+5971.800273506" observedRunningTime="2025-10-10 08:03:45.65554256 +0000 UTC m=+5972.750700766" watchObservedRunningTime="2025-10-10 08:03:45.665162197 +0000 UTC m=+5972.760320393" Oct 10 08:03:48 crc kubenswrapper[4822]: I1010 08:03:48.650680 4822 scope.go:117] "RemoveContainer" 
containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:48 crc kubenswrapper[4822]: E1010 08:03:48.651839 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:03:49 crc kubenswrapper[4822]: I1010 08:03:49.035292 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-54a5-account-create-prsl6"] Oct 10 08:03:49 crc kubenswrapper[4822]: I1010 08:03:49.043552 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-54a5-account-create-prsl6"] Oct 10 08:03:49 crc kubenswrapper[4822]: I1010 08:03:49.670280 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1302fbd4-38e4-4317-b2ae-ea5b2feec5e3" path="/var/lib/kubelet/pods/1302fbd4-38e4-4317-b2ae-ea5b2feec5e3/volumes" Oct 10 08:03:52 crc kubenswrapper[4822]: I1010 08:03:52.321292 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:52 crc kubenswrapper[4822]: I1010 08:03:52.321822 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:52 crc kubenswrapper[4822]: I1010 08:03:52.377545 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:52 crc kubenswrapper[4822]: I1010 08:03:52.772594 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:52 crc kubenswrapper[4822]: I1010 08:03:52.831448 4822 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:54 crc kubenswrapper[4822]: I1010 08:03:54.747126 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5fwq" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="registry-server" containerID="cri-o://92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed" gracePeriod=2 Oct 10 08:03:54 crc kubenswrapper[4822]: I1010 08:03:54.894140 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-wcrxk" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.349320 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.430514 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content\") pod \"7b0bb860-632e-4ab5-94b5-a022a3081da0\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.430574 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwl4c\" (UniqueName: \"kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c\") pod \"7b0bb860-632e-4ab5-94b5-a022a3081da0\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.430774 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities\") pod \"7b0bb860-632e-4ab5-94b5-a022a3081da0\" (UID: \"7b0bb860-632e-4ab5-94b5-a022a3081da0\") " Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.432397 4822 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities" (OuterVolumeSpecName: "utilities") pod "7b0bb860-632e-4ab5-94b5-a022a3081da0" (UID: "7b0bb860-632e-4ab5-94b5-a022a3081da0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.437850 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c" (OuterVolumeSpecName: "kube-api-access-kwl4c") pod "7b0bb860-632e-4ab5-94b5-a022a3081da0" (UID: "7b0bb860-632e-4ab5-94b5-a022a3081da0"). InnerVolumeSpecName "kube-api-access-kwl4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.444005 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b0bb860-632e-4ab5-94b5-a022a3081da0" (UID: "7b0bb860-632e-4ab5-94b5-a022a3081da0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.533689 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.533741 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwl4c\" (UniqueName: \"kubernetes.io/projected/7b0bb860-632e-4ab5-94b5-a022a3081da0-kube-api-access-kwl4c\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.533756 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b0bb860-632e-4ab5-94b5-a022a3081da0-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.760447 4822 generic.go:334] "Generic (PLEG): container finished" podID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerID="92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed" exitCode=0 Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.760488 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerDied","Data":"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed"} Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.760543 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5fwq" event={"ID":"7b0bb860-632e-4ab5-94b5-a022a3081da0","Type":"ContainerDied","Data":"4b13a9dad4f87fd36d64f0ef8177c92cd3a3c1e8769d45ccae1f672f0a05593f"} Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.760573 4822 scope.go:117] "RemoveContainer" containerID="92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 
08:03:55.761949 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5fwq" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.790545 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.794549 4822 scope.go:117] "RemoveContainer" containerID="102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.801320 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5fwq"] Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.830139 4822 scope.go:117] "RemoveContainer" containerID="310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.862231 4822 scope.go:117] "RemoveContainer" containerID="92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed" Oct 10 08:03:55 crc kubenswrapper[4822]: E1010 08:03:55.862710 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed\": container with ID starting with 92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed not found: ID does not exist" containerID="92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.862756 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed"} err="failed to get container status \"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed\": rpc error: code = NotFound desc = could not find container \"92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed\": container with ID starting with 
92ac02c22a283126d66086fdcb7b92acbe39e1fdb1ef6fca0a64ebfcfe753eed not found: ID does not exist" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.862778 4822 scope.go:117] "RemoveContainer" containerID="102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799" Oct 10 08:03:55 crc kubenswrapper[4822]: E1010 08:03:55.863234 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799\": container with ID starting with 102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799 not found: ID does not exist" containerID="102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.863259 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799"} err="failed to get container status \"102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799\": rpc error: code = NotFound desc = could not find container \"102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799\": container with ID starting with 102e52f59b181400d4fbe4b664b797e50cb0c25cad1b4d1268538c9506eea799 not found: ID does not exist" Oct 10 08:03:55 crc kubenswrapper[4822]: I1010 08:03:55.863278 4822 scope.go:117] "RemoveContainer" containerID="310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc" Oct 10 08:03:55 crc kubenswrapper[4822]: E1010 08:03:55.863675 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc\": container with ID starting with 310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc not found: ID does not exist" containerID="310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc" Oct 10 08:03:55 crc 
kubenswrapper[4822]: I1010 08:03:55.863730 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc"} err="failed to get container status \"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc\": rpc error: code = NotFound desc = could not find container \"310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc\": container with ID starting with 310c3cc083a9a913eae22b0ab1434a415a504f313e4df90633953db492bd89cc not found: ID does not exist" Oct 10 08:03:56 crc kubenswrapper[4822]: I1010 08:03:56.032156 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vhdnd"] Oct 10 08:03:56 crc kubenswrapper[4822]: I1010 08:03:56.042883 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vhdnd"] Oct 10 08:03:57 crc kubenswrapper[4822]: I1010 08:03:57.662997 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d93a63b-47d4-4c9f-8670-e22defaaed84" path="/var/lib/kubelet/pods/6d93a63b-47d4-4c9f-8670-e22defaaed84/volumes" Oct 10 08:03:57 crc kubenswrapper[4822]: I1010 08:03:57.663734 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" path="/var/lib/kubelet/pods/7b0bb860-632e-4ab5-94b5-a022a3081da0/volumes" Oct 10 08:03:59 crc kubenswrapper[4822]: I1010 08:03:59.650460 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:03:59 crc kubenswrapper[4822]: E1010 08:03:59.652134 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.202601 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.203753 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="octavia-amphora-httpd" containerID="cri-o://f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2" gracePeriod=30 Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.843981 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.919578 4822 generic.go:334] "Generic (PLEG): container finished" podID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerID="f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2" exitCode=0 Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.919658 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerDied","Data":"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2"} Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.919709 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" event={"ID":"d6d52e62-799c-46fb-9e4f-3de59e66b7eb","Type":"ContainerDied","Data":"0cac96d19a7291131e1fbef32331ee8b933aa540efc4d0cd05facbfb5452334d"} Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.919739 4822 scope.go:117] "RemoveContainer" containerID="f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 
08:04:08.920017 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-4gwkl" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.949876 4822 scope.go:117] "RemoveContainer" containerID="eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.949948 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image\") pod \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.950010 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config\") pod \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\" (UID: \"d6d52e62-799c-46fb-9e4f-3de59e66b7eb\") " Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.992086 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "d6d52e62-799c-46fb-9e4f-3de59e66b7eb" (UID: "d6d52e62-799c-46fb-9e4f-3de59e66b7eb"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.995708 4822 scope.go:117] "RemoveContainer" containerID="f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2" Oct 10 08:04:08 crc kubenswrapper[4822]: E1010 08:04:08.996453 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2\": container with ID starting with f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2 not found: ID does not exist" containerID="f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.996501 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2"} err="failed to get container status \"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2\": rpc error: code = NotFound desc = could not find container \"f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2\": container with ID starting with f53b7cfe8e664e9e42dae46237491a3839a73af84f3763583e9b043c2b89b1d2 not found: ID does not exist" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.996526 4822 scope.go:117] "RemoveContainer" containerID="eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560" Oct 10 08:04:08 crc kubenswrapper[4822]: E1010 08:04:08.997241 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560\": container with ID starting with eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560 not found: ID does not exist" containerID="eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560" Oct 10 08:04:08 crc kubenswrapper[4822]: I1010 08:04:08.997292 
4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560"} err="failed to get container status \"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560\": rpc error: code = NotFound desc = could not find container \"eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560\": container with ID starting with eb73e28aa53ac9d98aff1c9ed69eb38ec5d70786dacae14f357dbc645a2d4560 not found: ID does not exist" Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.001552 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d6d52e62-799c-46fb-9e4f-3de59e66b7eb" (UID: "d6d52e62-799c-46fb-9e4f-3de59e66b7eb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.052365 4822 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.052612 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6d52e62-799c-46fb-9e4f-3de59e66b7eb-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.265475 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.276124 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-4gwkl"] Oct 10 08:04:09 crc kubenswrapper[4822]: I1010 08:04:09.663685 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" path="/var/lib/kubelet/pods/d6d52e62-799c-46fb-9e4f-3de59e66b7eb/volumes" Oct 10 08:04:13 crc kubenswrapper[4822]: I1010 08:04:13.657666 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:04:13 crc kubenswrapper[4822]: E1010 08:04:13.658643 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.342546 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vhlx4"] Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343670 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="registry-server" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343697 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="registry-server" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343738 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerName="octavia-db-sync" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343745 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerName="octavia-db-sync" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343763 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerName="init" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 
08:04:14.343770 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerName="init" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343779 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="octavia-amphora-httpd" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343786 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="octavia-amphora-httpd" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343797 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="extract-utilities" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343819 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="extract-utilities" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343844 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="init" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343850 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="init" Oct 10 08:04:14 crc kubenswrapper[4822]: E1010 08:04:14.343865 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="extract-content" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.343873 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="extract-content" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.344101 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" containerName="octavia-db-sync" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.344114 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d52e62-799c-46fb-9e4f-3de59e66b7eb" containerName="octavia-amphora-httpd" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.344130 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0bb860-632e-4ab5-94b5-a022a3081da0" containerName="registry-server" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.345707 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.349398 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.352617 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vhlx4"] Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.387796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/635e9734-7d78-4fff-8491-37c1fc8b69e1-httpd-config\") pod \"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.388257 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/635e9734-7d78-4fff-8491-37c1fc8b69e1-amphora-image\") pod \"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.491236 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/635e9734-7d78-4fff-8491-37c1fc8b69e1-httpd-config\") pod 
\"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.491766 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/635e9734-7d78-4fff-8491-37c1fc8b69e1-amphora-image\") pod \"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.492418 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/635e9734-7d78-4fff-8491-37c1fc8b69e1-amphora-image\") pod \"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.504023 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/635e9734-7d78-4fff-8491-37c1fc8b69e1-httpd-config\") pod \"octavia-image-upload-59f8cff499-vhlx4\" (UID: \"635e9734-7d78-4fff-8491-37c1fc8b69e1\") " pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:14 crc kubenswrapper[4822]: I1010 08:04:14.673426 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" Oct 10 08:04:15 crc kubenswrapper[4822]: I1010 08:04:15.200057 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vhlx4"] Oct 10 08:04:16 crc kubenswrapper[4822]: I1010 08:04:16.036414 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" event={"ID":"635e9734-7d78-4fff-8491-37c1fc8b69e1","Type":"ContainerStarted","Data":"a9642184ad7e089efd8384fad2e499f558b9eeb8eb9095ca3ed7c74983d2710f"} Oct 10 08:04:17 crc kubenswrapper[4822]: I1010 08:04:17.074249 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" event={"ID":"635e9734-7d78-4fff-8491-37c1fc8b69e1","Type":"ContainerStarted","Data":"f9340a729ab0f07f06c471bee878ad317a42db5745efd656b575c0a7166f3f67"} Oct 10 08:04:18 crc kubenswrapper[4822]: I1010 08:04:18.088580 4822 generic.go:334] "Generic (PLEG): container finished" podID="635e9734-7d78-4fff-8491-37c1fc8b69e1" containerID="f9340a729ab0f07f06c471bee878ad317a42db5745efd656b575c0a7166f3f67" exitCode=0 Oct 10 08:04:18 crc kubenswrapper[4822]: I1010 08:04:18.088630 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" event={"ID":"635e9734-7d78-4fff-8491-37c1fc8b69e1","Type":"ContainerDied","Data":"f9340a729ab0f07f06c471bee878ad317a42db5745efd656b575c0a7166f3f67"} Oct 10 08:04:20 crc kubenswrapper[4822]: I1010 08:04:20.159228 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" event={"ID":"635e9734-7d78-4fff-8491-37c1fc8b69e1","Type":"ContainerStarted","Data":"c6323cdfe5babda2bed033c7b51afd3f51b63119bf2ec3049f7e0f7250740bb1"} Oct 10 08:04:20 crc kubenswrapper[4822]: I1010 08:04:20.198035 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-vhlx4" 
podStartSLOduration=2.002814496 podStartE2EDuration="6.197968717s" podCreationTimestamp="2025-10-10 08:04:14 +0000 UTC" firstStartedPulling="2025-10-10 08:04:15.211753621 +0000 UTC m=+6002.306911827" lastFinishedPulling="2025-10-10 08:04:19.406907852 +0000 UTC m=+6006.502066048" observedRunningTime="2025-10-10 08:04:20.188839024 +0000 UTC m=+6007.283997230" watchObservedRunningTime="2025-10-10 08:04:20.197968717 +0000 UTC m=+6007.293126913" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.557085 4822 scope.go:117] "RemoveContainer" containerID="1691843ee79b16d9ed91cbb09ef72834b05b15c06ffc039415197ae009fb180d" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.588165 4822 scope.go:117] "RemoveContainer" containerID="bd500310655ee0a22d68385992a9fbe711328b89a4c4faf84c54a65b6a45f8c2" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.620989 4822 scope.go:117] "RemoveContainer" containerID="113da4b3073efe7f9ca2fe6d65dd40b1f912a8bdaf36a9dd554f0c85ad823093" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.668729 4822 scope.go:117] "RemoveContainer" containerID="18a1a9e7dcf699613f1826b3d76e1d7fb9dece54a10d10986413860ed8077493" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.693285 4822 scope.go:117] "RemoveContainer" containerID="30a4c47e78346f33907bd0aa49a07c5527d8e4127d8bcc91d22da652ed6ce13d" Oct 10 08:04:24 crc kubenswrapper[4822]: I1010 08:04:24.746416 4822 scope.go:117] "RemoveContainer" containerID="61487c853effe2ba9a0a04584e4ca5faeea24ef7b9e0f73c32cf9a5d061f9830" Oct 10 08:04:25 crc kubenswrapper[4822]: I1010 08:04:25.033542 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qr5vj"] Oct 10 08:04:25 crc kubenswrapper[4822]: I1010 08:04:25.042453 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qr5vj"] Oct 10 08:04:25 crc kubenswrapper[4822]: I1010 08:04:25.665126 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3" path="/var/lib/kubelet/pods/f4d470bc-cf67-4ef1-aabe-e5b175f6b1b3/volumes" Oct 10 08:04:26 crc kubenswrapper[4822]: I1010 08:04:26.650675 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:04:26 crc kubenswrapper[4822]: E1010 08:04:26.651301 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.382210 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-cbvgt"] Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.384625 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-cbvgt" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.389030 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.389092 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.391150 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.405386 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-cbvgt"] Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.414760 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/17cf8698-d96c-44da-b74d-8c1986940707-config-data-merged\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.414885 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-combined-ca-bundle\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt" Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.414927 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-amphora-certs\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt" Oct 10 08:04:32 crc 
kubenswrapper[4822]: I1010 08:04:32.415028 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/17cf8698-d96c-44da-b74d-8c1986940707-hm-ports\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.415069 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-scripts\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.415169 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-config-data\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.518292 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/17cf8698-d96c-44da-b74d-8c1986940707-config-data-merged\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.518957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-combined-ca-bundle\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.519010 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-amphora-certs\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.519125 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/17cf8698-d96c-44da-b74d-8c1986940707-hm-ports\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.519173 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-scripts\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.519279 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-config-data\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.520774 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/17cf8698-d96c-44da-b74d-8c1986940707-config-data-merged\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.521667 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/17cf8698-d96c-44da-b74d-8c1986940707-hm-ports\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.529747 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-combined-ca-bundle\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.532322 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-scripts\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.542730 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-amphora-certs\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.555720 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cf8698-d96c-44da-b74d-8c1986940707-config-data\") pod \"octavia-healthmanager-cbvgt\" (UID: \"17cf8698-d96c-44da-b74d-8c1986940707\") " pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:32 crc kubenswrapper[4822]: I1010 08:04:32.713314 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:33 crc kubenswrapper[4822]: I1010 08:04:33.344757 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-cbvgt"]
Oct 10 08:04:34 crc kubenswrapper[4822]: I1010 08:04:34.348667 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cbvgt" event={"ID":"17cf8698-d96c-44da-b74d-8c1986940707","Type":"ContainerStarted","Data":"e94b4e09e3c4d09b7a0273a181fcd6527da87d36fa6746ea1b1cca13e4d2a6e2"}
Oct 10 08:04:34 crc kubenswrapper[4822]: I1010 08:04:34.349360 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cbvgt" event={"ID":"17cf8698-d96c-44da-b74d-8c1986940707","Type":"ContainerStarted","Data":"63b8f3cd7f96c9f32466e4b91bfb1ee3e9a58874c8a426cf2bf98c8cc47f99f0"}
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.055862 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1632-account-create-kbgz7"]
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.066863 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1632-account-create-kbgz7"]
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.664754 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d688cb-da87-41e9-9a40-71b73cd5e4ec" path="/var/lib/kubelet/pods/06d688cb-da87-41e9-9a40-71b73cd5e4ec/volumes"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.922330 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-jqvqw"]
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.927028 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.933082 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.933201 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.939631 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-jqvqw"]
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.955510 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.955768 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-scripts\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.955848 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-combined-ca-bundle\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.956024 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data-merged\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.956132 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-hm-ports\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:35 crc kubenswrapper[4822]: I1010 08:04:35.956664 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-amphora-certs\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.058969 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-amphora-certs\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.059036 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.059104 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-scripts\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.059138 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-combined-ca-bundle\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.059221 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data-merged\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.059273 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-hm-ports\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.060441 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data-merged\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.061463 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-hm-ports\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.068585 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-amphora-certs\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.069265 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-config-data\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.070017 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-combined-ca-bundle\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.070422 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e6cf62-30fb-4506-be4f-1e15c90cfaf1-scripts\") pod \"octavia-housekeeping-jqvqw\" (UID: \"35e6cf62-30fb-4506-be4f-1e15c90cfaf1\") " pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.287419 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.379534 4822 generic.go:334] "Generic (PLEG): container finished" podID="17cf8698-d96c-44da-b74d-8c1986940707" containerID="e94b4e09e3c4d09b7a0273a181fcd6527da87d36fa6746ea1b1cca13e4d2a6e2" exitCode=0
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.379617 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cbvgt" event={"ID":"17cf8698-d96c-44da-b74d-8c1986940707","Type":"ContainerDied","Data":"e94b4e09e3c4d09b7a0273a181fcd6527da87d36fa6746ea1b1cca13e4d2a6e2"}
Oct 10 08:04:36 crc kubenswrapper[4822]: I1010 08:04:36.941389 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-jqvqw"]
Oct 10 08:04:36 crc kubenswrapper[4822]: W1010 08:04:36.951242 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35e6cf62_30fb_4506_be4f_1e15c90cfaf1.slice/crio-2eec026cad36582a283e93cd3dd42d58b88d4aa08d28f8cf977999d5976b46c4 WatchSource:0}: Error finding container 2eec026cad36582a283e93cd3dd42d58b88d4aa08d28f8cf977999d5976b46c4: Status 404 returned error can't find the container with id 2eec026cad36582a283e93cd3dd42d58b88d4aa08d28f8cf977999d5976b46c4
Oct 10 08:04:37 crc kubenswrapper[4822]: I1010 08:04:37.400418 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cbvgt" event={"ID":"17cf8698-d96c-44da-b74d-8c1986940707","Type":"ContainerStarted","Data":"63a5404fa266e6171f131636538c2c62d7343a4aa2d395fe19c5e27e49eae2a3"}
Oct 10 08:04:37 crc kubenswrapper[4822]: I1010 08:04:37.401739 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:37 crc kubenswrapper[4822]: I1010 08:04:37.402160 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jqvqw" event={"ID":"35e6cf62-30fb-4506-be4f-1e15c90cfaf1","Type":"ContainerStarted","Data":"2eec026cad36582a283e93cd3dd42d58b88d4aa08d28f8cf977999d5976b46c4"}
Oct 10 08:04:37 crc kubenswrapper[4822]: I1010 08:04:37.423193 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-cbvgt" podStartSLOduration=5.423165662 podStartE2EDuration="5.423165662s" podCreationTimestamp="2025-10-10 08:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:04:37.420221818 +0000 UTC m=+6024.515380054" watchObservedRunningTime="2025-10-10 08:04:37.423165662 +0000 UTC m=+6024.518323878"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.235955 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-5tqgz"]
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.242030 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.244600 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.246141 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.258486 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5tqgz"]
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.432939 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jqvqw" event={"ID":"35e6cf62-30fb-4506-be4f-1e15c90cfaf1","Type":"ContainerStarted","Data":"9ab9431e96e3fe741f224af390413d28986e7178ac9c325409474f409bbf5e65"}
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.435083 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.435335 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-scripts\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.435673 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/416b6683-2b79-4be3-bb3b-4ece64ea85c4-hm-ports\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.436197 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-amphora-certs\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.436483 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data-merged\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.436661 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-combined-ca-bundle\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538238 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-amphora-certs\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538323 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data-merged\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538348 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-combined-ca-bundle\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538395 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538418 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-scripts\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.538460 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/416b6683-2b79-4be3-bb3b-4ece64ea85c4-hm-ports\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.539884 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/416b6683-2b79-4be3-bb3b-4ece64ea85c4-hm-ports\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.540690 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data-merged\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.546868 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-combined-ca-bundle\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.547767 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-amphora-certs\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.548137 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-config-data\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.571935 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/416b6683-2b79-4be3-bb3b-4ece64ea85c4-scripts\") pod \"octavia-worker-5tqgz\" (UID: \"416b6683-2b79-4be3-bb3b-4ece64ea85c4\") " pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.652995 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4"
Oct 10 08:04:39 crc kubenswrapper[4822]: I1010 08:04:39.862831 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:40 crc kubenswrapper[4822]: I1010 08:04:40.480690 4822 generic.go:334] "Generic (PLEG): container finished" podID="35e6cf62-30fb-4506-be4f-1e15c90cfaf1" containerID="9ab9431e96e3fe741f224af390413d28986e7178ac9c325409474f409bbf5e65" exitCode=0
Oct 10 08:04:40 crc kubenswrapper[4822]: I1010 08:04:40.480869 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jqvqw" event={"ID":"35e6cf62-30fb-4506-be4f-1e15c90cfaf1","Type":"ContainerDied","Data":"9ab9431e96e3fe741f224af390413d28986e7178ac9c325409474f409bbf5e65"}
Oct 10 08:04:40 crc kubenswrapper[4822]: I1010 08:04:40.499333 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5"}
Oct 10 08:04:40 crc kubenswrapper[4822]: I1010 08:04:40.543472 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5tqgz"]
Oct 10 08:04:40 crc kubenswrapper[4822]: W1010 08:04:40.560815 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod416b6683_2b79_4be3_bb3b_4ece64ea85c4.slice/crio-529e1dcbe8a01eb63acdb062a3a731dd4697f1a50d96cf87c1234152a4a0470d WatchSource:0}: Error finding container 529e1dcbe8a01eb63acdb062a3a731dd4697f1a50d96cf87c1234152a4a0470d: Status 404 returned error can't find the container with id 529e1dcbe8a01eb63acdb062a3a731dd4697f1a50d96cf87c1234152a4a0470d
Oct 10 08:04:41 crc kubenswrapper[4822]: I1010 08:04:41.532077 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-jqvqw" event={"ID":"35e6cf62-30fb-4506-be4f-1e15c90cfaf1","Type":"ContainerStarted","Data":"357c992736382e6d145e30c81c97436d2d1d62453c64eb8981bcece9b2451669"}
Oct 10 08:04:41 crc kubenswrapper[4822]: I1010 08:04:41.533289 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:41 crc kubenswrapper[4822]: I1010 08:04:41.535634 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5tqgz" event={"ID":"416b6683-2b79-4be3-bb3b-4ece64ea85c4","Type":"ContainerStarted","Data":"529e1dcbe8a01eb63acdb062a3a731dd4697f1a50d96cf87c1234152a4a0470d"}
Oct 10 08:04:41 crc kubenswrapper[4822]: I1010 08:04:41.567278 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-jqvqw" podStartSLOduration=5.159590969 podStartE2EDuration="6.56725233s" podCreationTimestamp="2025-10-10 08:04:35 +0000 UTC" firstStartedPulling="2025-10-10 08:04:36.954472991 +0000 UTC m=+6024.049631187" lastFinishedPulling="2025-10-10 08:04:38.362134352 +0000 UTC m=+6025.457292548" observedRunningTime="2025-10-10 08:04:41.567150828 +0000 UTC m=+6028.662309054" watchObservedRunningTime="2025-10-10 08:04:41.56725233 +0000 UTC m=+6028.662410526"
Oct 10 08:04:43 crc kubenswrapper[4822]: I1010 08:04:43.565425 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5tqgz" event={"ID":"416b6683-2b79-4be3-bb3b-4ece64ea85c4","Type":"ContainerStarted","Data":"78ab2bfd4f5bf21725d2f1ee23cf249668ea113180b1019753112bd1da9c8e9f"}
Oct 10 08:04:44 crc kubenswrapper[4822]: I1010 08:04:44.035614 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fklpm"]
Oct 10 08:04:44 crc kubenswrapper[4822]: I1010 08:04:44.042622 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fklpm"]
Oct 10 08:04:44 crc kubenswrapper[4822]: I1010 08:04:44.575580 4822 generic.go:334] "Generic (PLEG): container finished" podID="416b6683-2b79-4be3-bb3b-4ece64ea85c4" containerID="78ab2bfd4f5bf21725d2f1ee23cf249668ea113180b1019753112bd1da9c8e9f" exitCode=0
Oct 10 08:04:44 crc kubenswrapper[4822]: I1010 08:04:44.575624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5tqgz" event={"ID":"416b6683-2b79-4be3-bb3b-4ece64ea85c4","Type":"ContainerDied","Data":"78ab2bfd4f5bf21725d2f1ee23cf249668ea113180b1019753112bd1da9c8e9f"}
Oct 10 08:04:45 crc kubenswrapper[4822]: I1010 08:04:45.587514 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5tqgz" event={"ID":"416b6683-2b79-4be3-bb3b-4ece64ea85c4","Type":"ContainerStarted","Data":"0a041c7a00223281e8979d873170fde64f657766947ede51f475a341941a055b"}
Oct 10 08:04:45 crc kubenswrapper[4822]: I1010 08:04:45.588508 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-5tqgz"
Oct 10 08:04:45 crc kubenswrapper[4822]: I1010 08:04:45.621629 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-5tqgz" podStartSLOduration=4.841078261 podStartE2EDuration="6.621609941s" podCreationTimestamp="2025-10-10 08:04:39 +0000 UTC" firstStartedPulling="2025-10-10 08:04:40.566334335 +0000 UTC m=+6027.661492531" lastFinishedPulling="2025-10-10 08:04:42.346866015 +0000 UTC m=+6029.442024211" observedRunningTime="2025-10-10 08:04:45.613087445 +0000 UTC m=+6032.708245671" watchObservedRunningTime="2025-10-10 08:04:45.621609941 +0000 UTC m=+6032.716768137"
Oct 10 08:04:45 crc kubenswrapper[4822]: I1010 08:04:45.662607 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfced2e9-1b14-4bc0-a83f-8c3e2c610a76" path="/var/lib/kubelet/pods/bfced2e9-1b14-4bc0-a83f-8c3e2c610a76/volumes"
Oct 10 08:04:47 crc kubenswrapper[4822]: I1010 08:04:47.754507 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-cbvgt"
Oct 10 08:04:51 crc kubenswrapper[4822]: I1010 08:04:51.325262 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-jqvqw"
Oct 10 08:04:54 crc kubenswrapper[4822]: I1010 08:04:54.915044 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-5tqgz"
Oct 10 08:05:24 crc kubenswrapper[4822]: I1010 08:05:24.934475 4822 scope.go:117] "RemoveContainer" containerID="f08abf6ab3617a1144bb4d385b1f349f2c3b2188bc2a207accc57473d3f0c887"
Oct 10 08:05:25 crc kubenswrapper[4822]: I1010 08:05:25.004995 4822 scope.go:117] "RemoveContainer" containerID="6457dd756c10613695cd6ba6d642b53fc5f5ed98b68e3167c9ca43fb6a36668d"
Oct 10 08:05:25 crc kubenswrapper[4822]: I1010 08:05:25.036547 4822 scope.go:117] "RemoveContainer" containerID="a1edfca0e37befb38c753a6c7e4e1dbdcf923f092e54ecec843d06905f96196a"
Oct 10 08:05:27 crc kubenswrapper[4822]: I1010 08:05:27.048892 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6ztlk"]
Oct 10 08:05:27 crc kubenswrapper[4822]: I1010 08:05:27.059013 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6ztlk"]
Oct 10 08:05:27 crc kubenswrapper[4822]: I1010 08:05:27.674404 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1bc353-4be3-4cba-87ac-9cbee0c72e28" path="/var/lib/kubelet/pods/8d1bc353-4be3-4cba-87ac-9cbee0c72e28/volumes"
Oct 10 08:05:37 crc kubenswrapper[4822]: I1010 08:05:37.061998 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2435-account-create-dhd7j"]
Oct 10 08:05:37 crc kubenswrapper[4822]: I1010 08:05:37.074236 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2435-account-create-dhd7j"]
Oct 10 08:05:37 crc kubenswrapper[4822]: I1010 08:05:37.663965 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc" path="/var/lib/kubelet/pods/6d07e32b-7cb3-4bf5-bb0f-a2ca06dc11fc/volumes"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.032758 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-25sj8"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.044693 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-25sj8"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.285831 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.323083 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.323214 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.326211 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gpqh6"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.326502 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.326687 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.329352 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.395645 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.395971 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-log" containerID="cri-o://07c4a2944eb3eb0497406fc3f71936bc9d7dfb3bfee35c47e482b5f978d16563" gracePeriod=30
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.396426 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-httpd" containerID="cri-o://92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6" gracePeriod=30
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.415673 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-848c979cff-kpxhq"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.418441 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-848c979cff-kpxhq"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.445007 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-848c979cff-kpxhq"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.455333 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.455413 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.455546 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.455577 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg587\" (UniqueName: \"kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.455628 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.475782 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.476074 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-log" containerID="cri-o://d2e164305500f7e412ebe8483345263a4d55b63d7b10052cd8fba1d54e6c4751" gracePeriod=30
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.476185 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-httpd" containerID="cri-o://f110b0d579a08a84d345f309cd94acdd4fd3801b73f320ec5ed1e10d464a4793" gracePeriod=30
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.557934 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558005 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg587\" (UniqueName: \"kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr"
Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558096 4822 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7m9\" (UniqueName: \"kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558131 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558235 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558270 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558319 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558387 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558430 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.558463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.559700 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.559792 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.560350 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data\") pod \"horizon-6fbddcd9df-jpclr\" 
(UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.567528 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.575493 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg587\" (UniqueName: \"kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587\") pod \"horizon-6fbddcd9df-jpclr\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.655147 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.659937 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7m9\" (UniqueName: \"kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.660023 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.660053 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.660083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.660140 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.660618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.661369 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.662096 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " 
pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.667064 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1da02d-0767-411e-bea7-b592b9ea37e1" path="/var/lib/kubelet/pods/ae1da02d-0767-411e-bea7-b592b9ea37e1/volumes" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.669053 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.682475 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7m9\" (UniqueName: \"kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9\") pod \"horizon-848c979cff-kpxhq\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:47 crc kubenswrapper[4822]: I1010 08:05:47.749413 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.043783 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"] Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.099965 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.103482 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.128366 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.178579 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.178683 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.178790 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.179349 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.179418 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.190888 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"] Oct 10 08:05:48 crc kubenswrapper[4822]: W1010 08:05:48.204291 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48847aa7_6a8b_4d04_a427_edc9aa21743b.slice/crio-3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02 WatchSource:0}: Error finding container 3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02: Status 404 returned error can't find the container with id 3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02 Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282030 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282423 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282577 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " 
pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282833 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282913 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.282973 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.284010 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.289974 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.300237 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7\") pod \"horizon-6bc8479b8f-dzrp9\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.310683 4822 generic.go:334] "Generic (PLEG): container finished" podID="9ccc430e-b717-4005-927a-f062c96fb139" containerID="07c4a2944eb3eb0497406fc3f71936bc9d7dfb3bfee35c47e482b5f978d16563" exitCode=143 Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.310792 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerDied","Data":"07c4a2944eb3eb0497406fc3f71936bc9d7dfb3bfee35c47e482b5f978d16563"} Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.313525 4822 generic.go:334] "Generic (PLEG): container finished" podID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerID="d2e164305500f7e412ebe8483345263a4d55b63d7b10052cd8fba1d54e6c4751" exitCode=143 Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.313570 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerDied","Data":"d2e164305500f7e412ebe8483345263a4d55b63d7b10052cd8fba1d54e6c4751"} Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.315635 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" 
event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerStarted","Data":"3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02"} Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.383203 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-848c979cff-kpxhq"] Oct 10 08:05:48 crc kubenswrapper[4822]: W1010 08:05:48.385961 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ada923a_cb80_427a_94c5_b2a1fe4f2b15.slice/crio-b6a9e13a68de73bee3c4f14ebe47389a3e351416e62bcdd37396c4973932d41b WatchSource:0}: Error finding container b6a9e13a68de73bee3c4f14ebe47389a3e351416e62bcdd37396c4973932d41b: Status 404 returned error can't find the container with id b6a9e13a68de73bee3c4f14ebe47389a3e351416e62bcdd37396c4973932d41b Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.435939 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:48 crc kubenswrapper[4822]: W1010 08:05:48.943172 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef5a91b9_7681_41ac_9a5c_77d041506bea.slice/crio-1efe264c6fd79ed1958ad7d521803f791b8e8f5e55f87710cfcb690a5117ccbd WatchSource:0}: Error finding container 1efe264c6fd79ed1958ad7d521803f791b8e8f5e55f87710cfcb690a5117ccbd: Status 404 returned error can't find the container with id 1efe264c6fd79ed1958ad7d521803f791b8e8f5e55f87710cfcb690a5117ccbd Oct 10 08:05:48 crc kubenswrapper[4822]: I1010 08:05:48.951659 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:05:49 crc kubenswrapper[4822]: I1010 08:05:49.333238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-848c979cff-kpxhq" 
event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerStarted","Data":"b6a9e13a68de73bee3c4f14ebe47389a3e351416e62bcdd37396c4973932d41b"} Oct 10 08:05:49 crc kubenswrapper[4822]: I1010 08:05:49.336688 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerStarted","Data":"1efe264c6fd79ed1958ad7d521803f791b8e8f5e55f87710cfcb690a5117ccbd"} Oct 10 08:05:50 crc kubenswrapper[4822]: E1010 08:05:50.778493 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ccc430e_b717_4005_927a_f062c96fb139.slice/crio-92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ccc430e_b717_4005_927a_f062c96fb139.slice/crio-conmon-92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:05:51 crc kubenswrapper[4822]: I1010 08:05:51.364057 4822 generic.go:334] "Generic (PLEG): container finished" podID="9ccc430e-b717-4005-927a-f062c96fb139" containerID="92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6" exitCode=0 Oct 10 08:05:51 crc kubenswrapper[4822]: I1010 08:05:51.364165 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerDied","Data":"92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6"} Oct 10 08:05:51 crc kubenswrapper[4822]: I1010 08:05:51.368660 4822 generic.go:334] "Generic (PLEG): container finished" podID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerID="f110b0d579a08a84d345f309cd94acdd4fd3801b73f320ec5ed1e10d464a4793" exitCode=0 Oct 10 08:05:51 crc kubenswrapper[4822]: I1010 
08:05:51.368722 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerDied","Data":"f110b0d579a08a84d345f309cd94acdd4fd3801b73f320ec5ed1e10d464a4793"} Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.842511 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.852528 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.970931 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971048 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxb2x\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971091 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971158 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: 
\"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971271 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzg7l\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971356 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971390 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971456 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971523 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971609 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971645 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971681 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts\") pod \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\" (UID: \"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971736 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.971761 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts\") pod \"9ccc430e-b717-4005-927a-f062c96fb139\" (UID: \"9ccc430e-b717-4005-927a-f062c96fb139\") " Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.977724 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs" (OuterVolumeSpecName: "logs") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.978819 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.979354 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs" (OuterVolumeSpecName: "logs") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.983286 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.986533 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts" (OuterVolumeSpecName: "scripts") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.986724 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts" (OuterVolumeSpecName: "scripts") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.989595 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l" (OuterVolumeSpecName: "kube-api-access-mzg7l") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "kube-api-access-mzg7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.990607 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x" (OuterVolumeSpecName: "kube-api-access-vxb2x") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "kube-api-access-vxb2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:05:55 crc kubenswrapper[4822]: I1010 08:05:55.991893 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph" (OuterVolumeSpecName: "ceph") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.010231 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph" (OuterVolumeSpecName: "ceph") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.030734 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.037510 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076575 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxb2x\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-kube-api-access-vxb2x\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076616 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076627 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076638 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzg7l\" (UniqueName: \"kubernetes.io/projected/9ccc430e-b717-4005-927a-f062c96fb139-kube-api-access-mzg7l\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076648 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076658 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076668 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076678 4822 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076690 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076698 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076706 4822 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ccc430e-b717-4005-927a-f062c96fb139-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.076715 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.084610 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data" (OuterVolumeSpecName: "config-data") pod "9ccc430e-b717-4005-927a-f062c96fb139" (UID: "9ccc430e-b717-4005-927a-f062c96fb139"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.122002 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data" (OuterVolumeSpecName: "config-data") pod "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" (UID: "84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.178608 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.178664 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ccc430e-b717-4005-927a-f062c96fb139-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.432173 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerStarted","Data":"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde"} Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.436565 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9ccc430e-b717-4005-927a-f062c96fb139","Type":"ContainerDied","Data":"e40fafa7b80907f7f22962124936f5097a44a7ec4818d1155e9e7508ab5c691e"} Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.436626 4822 scope.go:117] "RemoveContainer" containerID="92fa7d32107182a58230f272f2fc15657ab114336abeba2ab90f77fc5d7ea4c6" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.436908 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.441973 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.441986 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393","Type":"ContainerDied","Data":"9b2083b3d05efe7390f4fd4f9d2246537865127be7c221b46bf13f1d200b942f"} Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.444524 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerStarted","Data":"aaae650c9d44f7b16749f5cab3702d985b0db1b9bda4f6850439de4631735b08"} Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.452627 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-848c979cff-kpxhq" event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerStarted","Data":"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826"} Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.493162 4822 scope.go:117] "RemoveContainer" containerID="07c4a2944eb3eb0497406fc3f71936bc9d7dfb3bfee35c47e482b5f978d16563" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.498601 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.528174 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.542909 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.552008 4822 scope.go:117] "RemoveContainer" containerID="f110b0d579a08a84d345f309cd94acdd4fd3801b73f320ec5ed1e10d464a4793" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.573258 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.583092 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: E1010 08:05:56.583770 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.583816 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: E1010 08:05:56.583858 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.583866 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: E1010 08:05:56.583885 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-log" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.583895 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-log" Oct 10 08:05:56 crc kubenswrapper[4822]: E1010 08:05:56.583910 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-log" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.583918 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-log" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.584125 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-log" Oct 10 08:05:56 crc 
kubenswrapper[4822]: I1010 08:05:56.584146 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.584163 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-log" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.584181 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-httpd" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.585495 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.588005 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.588242 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bc548" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.588393 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.590619 4822 scope.go:117] "RemoveContainer" containerID="d2e164305500f7e412ebe8483345263a4d55b63d7b10052cd8fba1d54e6c4751" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.611581 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.630275 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.632423 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.634534 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.641074 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.706879 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-ceph\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.706968 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.707218 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.707271 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " 
pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.707319 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.707342 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-logs\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.707471 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9jl\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-kube-api-access-6z9jl\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.809888 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.809949 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-logs\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" 
Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.810181 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-ceph\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.810229 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-logs\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.810247 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.810893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-logs\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811615 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: 
I1010 08:05:56.811692 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9jl\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-kube-api-access-6z9jl\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811733 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-ceph\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811781 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811827 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811852 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4x2r\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-kube-api-access-k4x2r\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 
08:05:56.811884 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.811962 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.812029 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.813265 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56faf384-52ff-4e12-ab58-6ced8cf71bc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.814948 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.817237 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.822148 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56faf384-52ff-4e12-ab58-6ced8cf71bc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.825354 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-ceph\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.834170 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9jl\" (UniqueName: \"kubernetes.io/projected/56faf384-52ff-4e12-ab58-6ced8cf71bc0-kube-api-access-6z9jl\") pod \"glance-default-external-api-0\" (UID: \"56faf384-52ff-4e12-ab58-6ced8cf71bc0\") " pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.913515 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914623 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-logs\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914673 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914869 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914893 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4x2r\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-kube-api-access-k4x2r\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.914925 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.915007 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-ceph\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.915553 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.915547 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e77277-2b4e-4b63-8596-ca5185866f05-logs\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.920292 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.920424 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-ceph\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.920843 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.936314 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e77277-2b4e-4b63-8596-ca5185866f05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.939042 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4x2r\" (UniqueName: \"kubernetes.io/projected/10e77277-2b4e-4b63-8596-ca5185866f05-kube-api-access-k4x2r\") pod \"glance-default-internal-api-0\" (UID: \"10e77277-2b4e-4b63-8596-ca5185866f05\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:05:56 crc kubenswrapper[4822]: I1010 08:05:56.959380 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.462629 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerStarted","Data":"bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76"} Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.462766 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fbddcd9df-jpclr" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon" containerID="cri-o://bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76" gracePeriod=30 Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.462795 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fbddcd9df-jpclr" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon-log" containerID="cri-o://aaae650c9d44f7b16749f5cab3702d985b0db1b9bda4f6850439de4631735b08" gracePeriod=30 Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.476169 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-848c979cff-kpxhq" event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerStarted","Data":"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229"} Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.485065 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerStarted","Data":"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e"} Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.498057 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fbddcd9df-jpclr" podStartSLOduration=2.909772152 podStartE2EDuration="10.49802544s" podCreationTimestamp="2025-10-10 08:05:47 
+0000 UTC" firstStartedPulling="2025-10-10 08:05:48.209466255 +0000 UTC m=+6095.304624451" lastFinishedPulling="2025-10-10 08:05:55.797719533 +0000 UTC m=+6102.892877739" observedRunningTime="2025-10-10 08:05:57.491035678 +0000 UTC m=+6104.586193874" watchObservedRunningTime="2025-10-10 08:05:57.49802544 +0000 UTC m=+6104.593183636" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.550743 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6bc8479b8f-dzrp9" podStartSLOduration=2.6651354879999998 podStartE2EDuration="9.550723719s" podCreationTimestamp="2025-10-10 08:05:48 +0000 UTC" firstStartedPulling="2025-10-10 08:05:48.953774832 +0000 UTC m=+6096.048933028" lastFinishedPulling="2025-10-10 08:05:55.839363073 +0000 UTC m=+6102.934521259" observedRunningTime="2025-10-10 08:05:57.525208783 +0000 UTC m=+6104.620366979" watchObservedRunningTime="2025-10-10 08:05:57.550723719 +0000 UTC m=+6104.645881915" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.561870 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-848c979cff-kpxhq" podStartSLOduration=3.149201734 podStartE2EDuration="10.561840239s" podCreationTimestamp="2025-10-10 08:05:47 +0000 UTC" firstStartedPulling="2025-10-10 08:05:48.388318431 +0000 UTC m=+6095.483476627" lastFinishedPulling="2025-10-10 08:05:55.800956936 +0000 UTC m=+6102.896115132" observedRunningTime="2025-10-10 08:05:57.549257037 +0000 UTC m=+6104.644415243" watchObservedRunningTime="2025-10-10 08:05:57.561840239 +0000 UTC m=+6104.656998445" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.602695 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:05:57 crc kubenswrapper[4822]: W1010 08:05:57.674379 4822 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e77277_2b4e_4b63_8596_ca5185866f05.slice/crio-21275c97b20da20cdec02c27f458e3ee7c37f401255a055afa2c864f918d6c96 WatchSource:0}: Error finding container 21275c97b20da20cdec02c27f458e3ee7c37f401255a055afa2c864f918d6c96: Status 404 returned error can't find the container with id 21275c97b20da20cdec02c27f458e3ee7c37f401255a055afa2c864f918d6c96 Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.675892 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" path="/var/lib/kubelet/pods/84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393/volumes" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.677409 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccc430e-b717-4005-927a-f062c96fb139" path="/var/lib/kubelet/pods/9ccc430e-b717-4005-927a-f062c96fb139/volumes" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.678628 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.678661 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.750011 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:57 crc kubenswrapper[4822]: I1010 08:05:57.752679 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:05:58 crc kubenswrapper[4822]: I1010 08:05:58.436092 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:58 crc kubenswrapper[4822]: I1010 08:05:58.436523 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:05:58 crc kubenswrapper[4822]: 
I1010 08:05:58.563838 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56faf384-52ff-4e12-ab58-6ced8cf71bc0","Type":"ContainerStarted","Data":"e24105d39c48ca19bcb36d2a384e921114ae42efcc42691d844dd34f08e6cc66"} Oct 10 08:05:58 crc kubenswrapper[4822]: I1010 08:05:58.564438 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56faf384-52ff-4e12-ab58-6ced8cf71bc0","Type":"ContainerStarted","Data":"345d029769286ad1dedd1b341ed6c9edddb20e27b2035c10df4872e1dda60850"} Oct 10 08:05:58 crc kubenswrapper[4822]: I1010 08:05:58.571597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10e77277-2b4e-4b63-8596-ca5185866f05","Type":"ContainerStarted","Data":"db2faf71f90173de9944edaeee7b70709bb4ac0ca9a8caecafde795d959b644d"} Oct 10 08:05:58 crc kubenswrapper[4822]: I1010 08:05:58.571672 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10e77277-2b4e-4b63-8596-ca5185866f05","Type":"ContainerStarted","Data":"21275c97b20da20cdec02c27f458e3ee7c37f401255a055afa2c864f918d6c96"} Oct 10 08:05:59 crc kubenswrapper[4822]: I1010 08:05:59.589545 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10e77277-2b4e-4b63-8596-ca5185866f05","Type":"ContainerStarted","Data":"c8cceb0a565cd945a0dbe2f7c06546fe62c0a4bc8052f2d34d5bf06fc0304d7f"} Oct 10 08:05:59 crc kubenswrapper[4822]: I1010 08:05:59.595129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56faf384-52ff-4e12-ab58-6ced8cf71bc0","Type":"ContainerStarted","Data":"308bceb7e9c54296a2d47d8b68d1f3f0fa0e76cbbc7deb9b5d53a8a8fc8ce387"} Oct 10 08:05:59 crc kubenswrapper[4822]: I1010 08:05:59.630136 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.630113705 podStartE2EDuration="3.630113705s" podCreationTimestamp="2025-10-10 08:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:05:59.613399443 +0000 UTC m=+6106.708557679" watchObservedRunningTime="2025-10-10 08:05:59.630113705 +0000 UTC m=+6106.725271901" Oct 10 08:05:59 crc kubenswrapper[4822]: I1010 08:05:59.649028 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.648987209 podStartE2EDuration="3.648987209s" podCreationTimestamp="2025-10-10 08:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:05:59.646123657 +0000 UTC m=+6106.741281873" watchObservedRunningTime="2025-10-10 08:05:59.648987209 +0000 UTC m=+6106.744145405" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.914642 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.915507 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.947219 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.959677 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.959748 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:06 crc kubenswrapper[4822]: I1010 08:06:06.965577 4822 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.009482 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.029036 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.699706 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.700371 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.703001 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.703503 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:06:07 crc kubenswrapper[4822]: I1010 08:06:07.752039 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 10 08:06:08 crc kubenswrapper[4822]: I1010 08:06:08.438168 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 10 08:06:09 crc 
kubenswrapper[4822]: I1010 08:06:09.717738 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:06:09 crc kubenswrapper[4822]: I1010 08:06:09.718216 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:06:10 crc kubenswrapper[4822]: I1010 08:06:10.033653 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:10 crc kubenswrapper[4822]: I1010 08:06:10.100841 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:06:10 crc kubenswrapper[4822]: I1010 08:06:10.100976 4822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:06:10 crc kubenswrapper[4822]: I1010 08:06:10.104550 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:06:10 crc kubenswrapper[4822]: I1010 08:06:10.210785 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:06:18 crc kubenswrapper[4822]: I1010 08:06:18.039768 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-qqv8w"] Oct 10 08:06:18 crc kubenswrapper[4822]: I1010 08:06:18.048670 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-qqv8w"] Oct 10 08:06:19 crc kubenswrapper[4822]: I1010 08:06:19.570798 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:06:19 crc kubenswrapper[4822]: I1010 08:06:19.664718 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c56f1f0-0da4-4966-a2ce-2705737f9764" path="/var/lib/kubelet/pods/4c56f1f0-0da4-4966-a2ce-2705737f9764/volumes" Oct 10 08:06:20 crc kubenswrapper[4822]: I1010 08:06:20.250496 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:06:21 crc kubenswrapper[4822]: I1010 08:06:21.472340 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:06:21 crc kubenswrapper[4822]: I1010 08:06:21.816632 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.46:9292/healthcheck\": dial tcp 10.217.1.46:9292: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 10 08:06:21 crc kubenswrapper[4822]: I1010 08:06:21.818556 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="9ccc430e-b717-4005-927a-f062c96fb139" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.46:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:06:22 crc kubenswrapper[4822]: I1010 08:06:22.106819 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:06:22 crc kubenswrapper[4822]: I1010 08:06:22.186705 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-848c979cff-kpxhq"] Oct 10 08:06:22 crc kubenswrapper[4822]: I1010 08:06:22.186993 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon-log" containerID="cri-o://683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826" gracePeriod=30 Oct 10 08:06:22 crc kubenswrapper[4822]: I1010 08:06:22.187098 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" 
containerID="cri-o://bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229" gracePeriod=30 Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.043823 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.47:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.043953 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="84dbf6b3-4cfd-4ebf-946f-3a9dbd1fa393" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.47:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.143609 4822 scope.go:117] "RemoveContainer" containerID="611c7c6d95a4c4aa00307343c9f6f6a5790a7d270b909e6a491738b09b67dd19" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.194879 4822 scope.go:117] "RemoveContainer" containerID="93f719af86eb5f09d2e201096e3c52765c986af3cb8b7a9d681e89e7191d0f97" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.234078 4822 scope.go:117] "RemoveContainer" containerID="6df5236d20a2db9167c94b9bddbc2a92a1d378056540d84ac3a566ac807d7021" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.281545 4822 scope.go:117] "RemoveContainer" containerID="849eda3a0cf6cc49ba047cee2794ae178512815607fdd52335a08afb0af633b3" Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.908212 4822 generic.go:334] "Generic (PLEG): container finished" podID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerID="bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229" exitCode=0 Oct 10 08:06:25 crc kubenswrapper[4822]: I1010 08:06:25.908285 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-848c979cff-kpxhq" event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerDied","Data":"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229"} Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.750444 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 10 08:06:27 crc kubenswrapper[4822]: E1010 08:06:27.886252 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48847aa7_6a8b_4d04_a427_edc9aa21743b.slice/crio-bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952243 4822 generic.go:334] "Generic (PLEG): container finished" podID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerID="bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76" exitCode=137 Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952277 4822 generic.go:334] "Generic (PLEG): container finished" podID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerID="aaae650c9d44f7b16749f5cab3702d985b0db1b9bda4f6850439de4631735b08" exitCode=137 Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerDied","Data":"bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76"} Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952330 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" 
event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerDied","Data":"aaae650c9d44f7b16749f5cab3702d985b0db1b9bda4f6850439de4631735b08"} Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952340 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbddcd9df-jpclr" event={"ID":"48847aa7-6a8b-4d04-a427-edc9aa21743b","Type":"ContainerDied","Data":"3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02"} Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.952351 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7b09db6aa75edc69ac5136203bb1cd330b708f3ac4e401e800588da6579b02" Oct 10 08:06:27 crc kubenswrapper[4822]: I1010 08:06:27.978753 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.043751 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a241-account-create-dzbjp"] Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057161 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs\") pod \"48847aa7-6a8b-4d04-a427-edc9aa21743b\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057228 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key\") pod \"48847aa7-6a8b-4d04-a427-edc9aa21743b\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057266 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data\") pod 
\"48847aa7-6a8b-4d04-a427-edc9aa21743b\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057315 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts\") pod \"48847aa7-6a8b-4d04-a427-edc9aa21743b\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057331 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg587\" (UniqueName: \"kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587\") pod \"48847aa7-6a8b-4d04-a427-edc9aa21743b\" (UID: \"48847aa7-6a8b-4d04-a427-edc9aa21743b\") " Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057567 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs" (OuterVolumeSpecName: "logs") pod "48847aa7-6a8b-4d04-a427-edc9aa21743b" (UID: "48847aa7-6a8b-4d04-a427-edc9aa21743b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.057705 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48847aa7-6a8b-4d04-a427-edc9aa21743b-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.058595 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a241-account-create-dzbjp"] Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.069998 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "48847aa7-6a8b-4d04-a427-edc9aa21743b" (UID: "48847aa7-6a8b-4d04-a427-edc9aa21743b"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.081679 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587" (OuterVolumeSpecName: "kube-api-access-rg587") pod "48847aa7-6a8b-4d04-a427-edc9aa21743b" (UID: "48847aa7-6a8b-4d04-a427-edc9aa21743b"). InnerVolumeSpecName "kube-api-access-rg587". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.091118 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data" (OuterVolumeSpecName: "config-data") pod "48847aa7-6a8b-4d04-a427-edc9aa21743b" (UID: "48847aa7-6a8b-4d04-a427-edc9aa21743b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.092123 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts" (OuterVolumeSpecName: "scripts") pod "48847aa7-6a8b-4d04-a427-edc9aa21743b" (UID: "48847aa7-6a8b-4d04-a427-edc9aa21743b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.159819 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.159864 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg587\" (UniqueName: \"kubernetes.io/projected/48847aa7-6a8b-4d04-a427-edc9aa21743b-kube-api-access-rg587\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.159881 4822 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/48847aa7-6a8b-4d04-a427-edc9aa21743b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.159891 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48847aa7-6a8b-4d04-a427-edc9aa21743b-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.960962 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fbddcd9df-jpclr" Oct 10 08:06:28 crc kubenswrapper[4822]: I1010 08:06:28.998343 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"] Oct 10 08:06:29 crc kubenswrapper[4822]: I1010 08:06:29.006385 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fbddcd9df-jpclr"] Oct 10 08:06:29 crc kubenswrapper[4822]: I1010 08:06:29.665196 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6ec260-52e1-456e-aac9-19c02953a0e6" path="/var/lib/kubelet/pods/3e6ec260-52e1-456e-aac9-19c02953a0e6/volumes" Oct 10 08:06:29 crc kubenswrapper[4822]: I1010 08:06:29.668621 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" path="/var/lib/kubelet/pods/48847aa7-6a8b-4d04-a427-edc9aa21743b/volumes" Oct 10 08:06:34 crc kubenswrapper[4822]: I1010 08:06:34.054573 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rxxnx"] Oct 10 08:06:34 crc kubenswrapper[4822]: I1010 08:06:34.067190 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rxxnx"] Oct 10 08:06:35 crc kubenswrapper[4822]: I1010 08:06:35.671374 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab26663-d9d5-425f-9eee-36df23b8ce23" path="/var/lib/kubelet/pods/3ab26663-d9d5-425f-9eee-36df23b8ce23/volumes" Oct 10 08:06:37 crc kubenswrapper[4822]: I1010 08:06:37.750778 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 10 08:06:47 crc kubenswrapper[4822]: I1010 08:06:47.751223 4822 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-848c979cff-kpxhq" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 10 08:06:47 crc kubenswrapper[4822]: I1010 08:06:47.752005 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.680003 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.834772 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc7m9\" (UniqueName: \"kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9\") pod \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.834931 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts\") pod \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.835041 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs\") pod \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.835082 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key\") pod \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\" (UID: 
\"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.835130 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data\") pod \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\" (UID: \"8ada923a-cb80-427a-94c5-b2a1fe4f2b15\") " Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.836129 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs" (OuterVolumeSpecName: "logs") pod "8ada923a-cb80-427a-94c5-b2a1fe4f2b15" (UID: "8ada923a-cb80-427a-94c5-b2a1fe4f2b15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.843040 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8ada923a-cb80-427a-94c5-b2a1fe4f2b15" (UID: "8ada923a-cb80-427a-94c5-b2a1fe4f2b15"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.843316 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9" (OuterVolumeSpecName: "kube-api-access-wc7m9") pod "8ada923a-cb80-427a-94c5-b2a1fe4f2b15" (UID: "8ada923a-cb80-427a-94c5-b2a1fe4f2b15"). InnerVolumeSpecName "kube-api-access-wc7m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.865075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts" (OuterVolumeSpecName: "scripts") pod "8ada923a-cb80-427a-94c5-b2a1fe4f2b15" (UID: "8ada923a-cb80-427a-94c5-b2a1fe4f2b15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.867070 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data" (OuterVolumeSpecName: "config-data") pod "8ada923a-cb80-427a-94c5-b2a1fe4f2b15" (UID: "8ada923a-cb80-427a-94c5-b2a1fe4f2b15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.937616 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.937653 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.937666 4822 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 08:06:52.937677 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:52 crc kubenswrapper[4822]: I1010 
08:06:52.937686 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc7m9\" (UniqueName: \"kubernetes.io/projected/8ada923a-cb80-427a-94c5-b2a1fe4f2b15-kube-api-access-wc7m9\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.225032 4822 generic.go:334] "Generic (PLEG): container finished" podID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerID="683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826" exitCode=137 Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.225085 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-848c979cff-kpxhq" event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerDied","Data":"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826"} Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.225111 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-848c979cff-kpxhq" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.225381 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-848c979cff-kpxhq" event={"ID":"8ada923a-cb80-427a-94c5-b2a1fe4f2b15","Type":"ContainerDied","Data":"b6a9e13a68de73bee3c4f14ebe47389a3e351416e62bcdd37396c4973932d41b"} Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.225405 4822 scope.go:117] "RemoveContainer" containerID="bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.267432 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-848c979cff-kpxhq"] Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.280662 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-848c979cff-kpxhq"] Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.412885 4822 scope.go:117] "RemoveContainer" containerID="683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826" Oct 10 08:06:53 crc 
kubenswrapper[4822]: I1010 08:06:53.432907 4822 scope.go:117] "RemoveContainer" containerID="bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229" Oct 10 08:06:53 crc kubenswrapper[4822]: E1010 08:06:53.433873 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229\": container with ID starting with bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229 not found: ID does not exist" containerID="bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.433940 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229"} err="failed to get container status \"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229\": rpc error: code = NotFound desc = could not find container \"bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229\": container with ID starting with bb1fe9f8e730d56dac898964df35c2d9725bddc3933d1616d6b180c907560229 not found: ID does not exist" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.433980 4822 scope.go:117] "RemoveContainer" containerID="683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826" Oct 10 08:06:53 crc kubenswrapper[4822]: E1010 08:06:53.434613 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826\": container with ID starting with 683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826 not found: ID does not exist" containerID="683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.434789 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826"} err="failed to get container status \"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826\": rpc error: code = NotFound desc = could not find container \"683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826\": container with ID starting with 683f954e345901fc456ac497fa13eaaca02b81127b7c0dc1005e60b50c361826 not found: ID does not exist" Oct 10 08:06:53 crc kubenswrapper[4822]: I1010 08:06:53.667112 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" path="/var/lib/kubelet/pods/8ada923a-cb80-427a-94c5-b2a1fe4f2b15/volumes" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.790177 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:06:54 crc kubenswrapper[4822]: E1010 08:06:54.790868 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.790889 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: E1010 08:06:54.790917 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.790929 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: E1010 08:06:54.790947 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.790958 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: E1010 08:06:54.790984 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.790995 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.791315 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.791346 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ada923a-cb80-427a-94c5-b2a1fe4f2b15" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.791386 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.791403 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="48847aa7-6a8b-4d04-a427-edc9aa21743b" containerName="horizon-log" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.794042 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.810189 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.882677 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.882751 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz857\" (UniqueName: \"kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.882906 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.985089 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.985162 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cz857\" (UniqueName: \"kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.985259 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.985716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:54 crc kubenswrapper[4822]: I1010 08:06:54.985849 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:55 crc kubenswrapper[4822]: I1010 08:06:55.006687 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz857\" (UniqueName: \"kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857\") pod \"redhat-operators-9l5xk\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:55 crc kubenswrapper[4822]: I1010 08:06:55.171247 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:06:55 crc kubenswrapper[4822]: I1010 08:06:55.644286 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:06:55 crc kubenswrapper[4822]: W1010 08:06:55.649682 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37ea3d2_8851_4859_97ad_1c49bb382378.slice/crio-0d8d89cc0ceb4aaa45b6f7fd3f88655f2ba6b5e440dd666f96867a69fc5e1437 WatchSource:0}: Error finding container 0d8d89cc0ceb4aaa45b6f7fd3f88655f2ba6b5e440dd666f96867a69fc5e1437: Status 404 returned error can't find the container with id 0d8d89cc0ceb4aaa45b6f7fd3f88655f2ba6b5e440dd666f96867a69fc5e1437 Oct 10 08:06:56 crc kubenswrapper[4822]: I1010 08:06:56.259326 4822 generic.go:334] "Generic (PLEG): container finished" podID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerID="610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874" exitCode=0 Oct 10 08:06:56 crc kubenswrapper[4822]: I1010 08:06:56.259425 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerDied","Data":"610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874"} Oct 10 08:06:56 crc kubenswrapper[4822]: I1010 08:06:56.259853 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerStarted","Data":"0d8d89cc0ceb4aaa45b6f7fd3f88655f2ba6b5e440dd666f96867a69fc5e1437"} Oct 10 08:06:58 crc kubenswrapper[4822]: I1010 08:06:58.295709 4822 generic.go:334] "Generic (PLEG): container finished" podID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerID="4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31" exitCode=0 Oct 10 08:06:58 crc kubenswrapper[4822]: I1010 08:06:58.295773 
4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerDied","Data":"4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31"} Oct 10 08:06:59 crc kubenswrapper[4822]: I1010 08:06:59.310859 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerStarted","Data":"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22"} Oct 10 08:06:59 crc kubenswrapper[4822]: I1010 08:06:59.367056 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9l5xk" podStartSLOduration=2.7930232950000002 podStartE2EDuration="5.367035899s" podCreationTimestamp="2025-10-10 08:06:54 +0000 UTC" firstStartedPulling="2025-10-10 08:06:56.260966697 +0000 UTC m=+6163.356124913" lastFinishedPulling="2025-10-10 08:06:58.834979281 +0000 UTC m=+6165.930137517" observedRunningTime="2025-10-10 08:06:59.357419712 +0000 UTC m=+6166.452577908" watchObservedRunningTime="2025-10-10 08:06:59.367035899 +0000 UTC m=+6166.462194095" Oct 10 08:07:01 crc kubenswrapper[4822]: I1010 08:07:01.337395 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:07:01 crc kubenswrapper[4822]: I1010 08:07:01.337998 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:07:04 crc 
kubenswrapper[4822]: I1010 08:07:04.932857 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54c5d44449-hr4bs"] Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.935836 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.938142 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-config-data\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.938193 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5228af92-c9c7-494f-957f-0e63f41ca0eb-horizon-secret-key\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.938297 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkxr\" (UniqueName: \"kubernetes.io/projected/5228af92-c9c7-494f-957f-0e63f41ca0eb-kube-api-access-kdkxr\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.938379 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-scripts\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.938401 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5228af92-c9c7-494f-957f-0e63f41ca0eb-logs\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:04 crc kubenswrapper[4822]: I1010 08:07:04.947914 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c5d44449-hr4bs"] Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.040758 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdkxr\" (UniqueName: \"kubernetes.io/projected/5228af92-c9c7-494f-957f-0e63f41ca0eb-kube-api-access-kdkxr\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.040957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-scripts\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.040990 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5228af92-c9c7-494f-957f-0e63f41ca0eb-logs\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.041154 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-config-data\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc 
kubenswrapper[4822]: I1010 08:07:05.041200 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5228af92-c9c7-494f-957f-0e63f41ca0eb-horizon-secret-key\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.041848 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5228af92-c9c7-494f-957f-0e63f41ca0eb-logs\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.044160 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-scripts\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.046086 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5228af92-c9c7-494f-957f-0e63f41ca0eb-config-data\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.052242 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5228af92-c9c7-494f-957f-0e63f41ca0eb-horizon-secret-key\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.062417 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdkxr\" (UniqueName: 
\"kubernetes.io/projected/5228af92-c9c7-494f-957f-0e63f41ca0eb-kube-api-access-kdkxr\") pod \"horizon-54c5d44449-hr4bs\" (UID: \"5228af92-c9c7-494f-957f-0e63f41ca0eb\") " pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.172139 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.172190 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.225607 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.270944 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.438762 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:05 crc kubenswrapper[4822]: I1010 08:07:05.504948 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:05.843394 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c5d44449-hr4bs"] Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.373854 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-6m794"] Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.378593 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-6m794" Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.380521 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6m794"] Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.390321 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c5d44449-hr4bs" event={"ID":"5228af92-c9c7-494f-957f-0e63f41ca0eb","Type":"ContainerStarted","Data":"c3d3efb3c23f0e0227341dc9cec12ace87c7603b53546a16a93ba1aa5deb06db"} Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.390367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c5d44449-hr4bs" event={"ID":"5228af92-c9c7-494f-957f-0e63f41ca0eb","Type":"ContainerStarted","Data":"16681a9d23883962f495934ac24ceff31308e6750dc15951e9ea1b6e14fa07d7"} Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.480015 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xhn\" (UniqueName: \"kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn\") pod \"heat-db-create-6m794\" (UID: \"e836530f-40ee-424b-9b14-3456252a1b43\") " pod="openstack/heat-db-create-6m794" Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.582080 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xhn\" (UniqueName: \"kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn\") pod \"heat-db-create-6m794\" (UID: \"e836530f-40ee-424b-9b14-3456252a1b43\") " pod="openstack/heat-db-create-6m794" Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.607181 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xhn\" (UniqueName: \"kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn\") pod \"heat-db-create-6m794\" (UID: \"e836530f-40ee-424b-9b14-3456252a1b43\") " 
pod="openstack/heat-db-create-6m794" Oct 10 08:07:06 crc kubenswrapper[4822]: I1010 08:07:06.720471 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6m794" Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.185271 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6m794"] Oct 10 08:07:07 crc kubenswrapper[4822]: W1010 08:07:07.190614 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode836530f_40ee_424b_9b14_3456252a1b43.slice/crio-ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c WatchSource:0}: Error finding container ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c: Status 404 returned error can't find the container with id ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.402780 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6m794" event={"ID":"e836530f-40ee-424b-9b14-3456252a1b43","Type":"ContainerStarted","Data":"ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c"} Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.405849 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c5d44449-hr4bs" event={"ID":"5228af92-c9c7-494f-957f-0e63f41ca0eb","Type":"ContainerStarted","Data":"9858697bdff1bf3ffa63b7a93fd049331e8f9d8a5d5c958439f6402a4b638edd"} Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.406005 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9l5xk" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="registry-server" containerID="cri-o://00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22" gracePeriod=2 Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.436412 4822 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-54c5d44449-hr4bs" podStartSLOduration=3.436390988 podStartE2EDuration="3.436390988s" podCreationTimestamp="2025-10-10 08:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:07:07.435524403 +0000 UTC m=+6174.530682599" watchObservedRunningTime="2025-10-10 08:07:07.436390988 +0000 UTC m=+6174.531549194" Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.902365 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.917252 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities\") pod \"c37ea3d2-8851-4859-97ad-1c49bb382378\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.917475 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content\") pod \"c37ea3d2-8851-4859-97ad-1c49bb382378\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.917502 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz857\" (UniqueName: \"kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857\") pod \"c37ea3d2-8851-4859-97ad-1c49bb382378\" (UID: \"c37ea3d2-8851-4859-97ad-1c49bb382378\") " Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.918063 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities" (OuterVolumeSpecName: "utilities") pod 
"c37ea3d2-8851-4859-97ad-1c49bb382378" (UID: "c37ea3d2-8851-4859-97ad-1c49bb382378"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.918224 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:07 crc kubenswrapper[4822]: I1010 08:07:07.927076 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857" (OuterVolumeSpecName: "kube-api-access-cz857") pod "c37ea3d2-8851-4859-97ad-1c49bb382378" (UID: "c37ea3d2-8851-4859-97ad-1c49bb382378"). InnerVolumeSpecName "kube-api-access-cz857". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.004242 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c37ea3d2-8851-4859-97ad-1c49bb382378" (UID: "c37ea3d2-8851-4859-97ad-1c49bb382378"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.019897 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz857\" (UniqueName: \"kubernetes.io/projected/c37ea3d2-8851-4859-97ad-1c49bb382378-kube-api-access-cz857\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.020175 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37ea3d2-8851-4859-97ad-1c49bb382378-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.416083 4822 generic.go:334] "Generic (PLEG): container finished" podID="e836530f-40ee-424b-9b14-3456252a1b43" containerID="bac652866ea95e6d1fb45710a3ab56125adec615af37ed526cf30506df8f6966" exitCode=0 Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.416435 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6m794" event={"ID":"e836530f-40ee-424b-9b14-3456252a1b43","Type":"ContainerDied","Data":"bac652866ea95e6d1fb45710a3ab56125adec615af37ed526cf30506df8f6966"} Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.419454 4822 generic.go:334] "Generic (PLEG): container finished" podID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerID="00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22" exitCode=0 Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.420446 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l5xk" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.422916 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerDied","Data":"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22"} Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.422998 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l5xk" event={"ID":"c37ea3d2-8851-4859-97ad-1c49bb382378","Type":"ContainerDied","Data":"0d8d89cc0ceb4aaa45b6f7fd3f88655f2ba6b5e440dd666f96867a69fc5e1437"} Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.423034 4822 scope.go:117] "RemoveContainer" containerID="00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.497356 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.509087 4822 scope.go:117] "RemoveContainer" containerID="4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.517519 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9l5xk"] Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.546317 4822 scope.go:117] "RemoveContainer" containerID="610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.594001 4822 scope.go:117] "RemoveContainer" containerID="00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22" Oct 10 08:07:08 crc kubenswrapper[4822]: E1010 08:07:08.598364 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22\": container with ID starting with 00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22 not found: ID does not exist" containerID="00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.598406 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22"} err="failed to get container status \"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22\": rpc error: code = NotFound desc = could not find container \"00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22\": container with ID starting with 00177b7217999dba05d7f5bf81c4406ae4e974ee632f7aeb8890fa9bbf3d7d22 not found: ID does not exist" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.598434 4822 scope.go:117] "RemoveContainer" containerID="4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31" Oct 10 08:07:08 crc kubenswrapper[4822]: E1010 08:07:08.598796 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31\": container with ID starting with 4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31 not found: ID does not exist" containerID="4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.598832 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31"} err="failed to get container status \"4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31\": rpc error: code = NotFound desc = could not find container \"4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31\": container with ID 
starting with 4e90c676d306d4e8e8945435818fb471092fb0e775e06d388dd197afe35a0f31 not found: ID does not exist" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.598844 4822 scope.go:117] "RemoveContainer" containerID="610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874" Oct 10 08:07:08 crc kubenswrapper[4822]: E1010 08:07:08.599047 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874\": container with ID starting with 610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874 not found: ID does not exist" containerID="610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874" Oct 10 08:07:08 crc kubenswrapper[4822]: I1010 08:07:08.599067 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874"} err="failed to get container status \"610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874\": rpc error: code = NotFound desc = could not find container \"610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874\": container with ID starting with 610b8a449f0e03eabcedf8f3e998cc652b2e1332ba008b21b7ec9222076bb874 not found: ID does not exist" Oct 10 08:07:09 crc kubenswrapper[4822]: I1010 08:07:09.701396 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" path="/var/lib/kubelet/pods/c37ea3d2-8851-4859-97ad-1c49bb382378/volumes" Oct 10 08:07:09 crc kubenswrapper[4822]: I1010 08:07:09.878904 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-6m794" Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.066675 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7xhn\" (UniqueName: \"kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn\") pod \"e836530f-40ee-424b-9b14-3456252a1b43\" (UID: \"e836530f-40ee-424b-9b14-3456252a1b43\") " Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.082105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn" (OuterVolumeSpecName: "kube-api-access-z7xhn") pod "e836530f-40ee-424b-9b14-3456252a1b43" (UID: "e836530f-40ee-424b-9b14-3456252a1b43"). InnerVolumeSpecName "kube-api-access-z7xhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.170001 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7xhn\" (UniqueName: \"kubernetes.io/projected/e836530f-40ee-424b-9b14-3456252a1b43-kube-api-access-z7xhn\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.442118 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6m794" event={"ID":"e836530f-40ee-424b-9b14-3456252a1b43","Type":"ContainerDied","Data":"ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c"} Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.442160 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-6m794" Oct 10 08:07:10 crc kubenswrapper[4822]: I1010 08:07:10.442166 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce09d399e42727f2f52048aeffe20148b2c27f19c2137c030ded533a3f3c0f3c" Oct 10 08:07:15 crc kubenswrapper[4822]: I1010 08:07:15.272666 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:15 crc kubenswrapper[4822]: I1010 08:07:15.273531 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.478201 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4e57-account-create-pvtw5"] Oct 10 08:07:16 crc kubenswrapper[4822]: E1010 08:07:16.479252 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="registry-server" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.479273 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="registry-server" Oct 10 08:07:16 crc kubenswrapper[4822]: E1010 08:07:16.479287 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="extract-content" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.479294 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="extract-content" Oct 10 08:07:16 crc kubenswrapper[4822]: E1010 08:07:16.479327 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="extract-utilities" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.479336 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="extract-utilities" Oct 10 
08:07:16 crc kubenswrapper[4822]: E1010 08:07:16.479350 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e836530f-40ee-424b-9b14-3456252a1b43" containerName="mariadb-database-create" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.479357 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e836530f-40ee-424b-9b14-3456252a1b43" containerName="mariadb-database-create" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.482787 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e836530f-40ee-424b-9b14-3456252a1b43" containerName="mariadb-database-create" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.482946 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c37ea3d2-8851-4859-97ad-1c49bb382378" containerName="registry-server" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.508210 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.529639 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.548414 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4e57-account-create-pvtw5"] Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.633795 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9zrb\" (UniqueName: \"kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb\") pod \"heat-4e57-account-create-pvtw5\" (UID: \"0e5de391-05fb-4ce3-9c38-c48a25e28b88\") " pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.735687 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9zrb\" (UniqueName: 
\"kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb\") pod \"heat-4e57-account-create-pvtw5\" (UID: \"0e5de391-05fb-4ce3-9c38-c48a25e28b88\") " pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.762303 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9zrb\" (UniqueName: \"kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb\") pod \"heat-4e57-account-create-pvtw5\" (UID: \"0e5de391-05fb-4ce3-9c38-c48a25e28b88\") " pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:16 crc kubenswrapper[4822]: I1010 08:07:16.845095 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:17 crc kubenswrapper[4822]: I1010 08:07:17.326424 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4e57-account-create-pvtw5"] Oct 10 08:07:17 crc kubenswrapper[4822]: W1010 08:07:17.332948 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5de391_05fb_4ce3_9c38_c48a25e28b88.slice/crio-a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af WatchSource:0}: Error finding container a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af: Status 404 returned error can't find the container with id a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af Oct 10 08:07:17 crc kubenswrapper[4822]: I1010 08:07:17.551938 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4e57-account-create-pvtw5" event={"ID":"0e5de391-05fb-4ce3-9c38-c48a25e28b88","Type":"ContainerStarted","Data":"a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af"} Oct 10 08:07:18 crc kubenswrapper[4822]: I1010 08:07:18.564818 4822 generic.go:334] "Generic (PLEG): container finished" podID="0e5de391-05fb-4ce3-9c38-c48a25e28b88" 
containerID="895fd2984f54460d1a23d4e51b2ea31e64165ba74401499f2ca82fe5fc664a8b" exitCode=0 Oct 10 08:07:18 crc kubenswrapper[4822]: I1010 08:07:18.565044 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4e57-account-create-pvtw5" event={"ID":"0e5de391-05fb-4ce3-9c38-c48a25e28b88","Type":"ContainerDied","Data":"895fd2984f54460d1a23d4e51b2ea31e64165ba74401499f2ca82fe5fc664a8b"} Oct 10 08:07:19 crc kubenswrapper[4822]: I1010 08:07:19.926943 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.026762 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9zrb\" (UniqueName: \"kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb\") pod \"0e5de391-05fb-4ce3-9c38-c48a25e28b88\" (UID: \"0e5de391-05fb-4ce3-9c38-c48a25e28b88\") " Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.036563 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb" (OuterVolumeSpecName: "kube-api-access-n9zrb") pod "0e5de391-05fb-4ce3-9c38-c48a25e28b88" (UID: "0e5de391-05fb-4ce3-9c38-c48a25e28b88"). InnerVolumeSpecName "kube-api-access-n9zrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.138811 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9zrb\" (UniqueName: \"kubernetes.io/projected/0e5de391-05fb-4ce3-9c38-c48a25e28b88-kube-api-access-n9zrb\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.589292 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4e57-account-create-pvtw5" event={"ID":"0e5de391-05fb-4ce3-9c38-c48a25e28b88","Type":"ContainerDied","Data":"a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af"} Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.589707 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d2966e065202a7c4c753ccc2512fc8857df47c015021db76e61cc226c783af" Oct 10 08:07:20 crc kubenswrapper[4822]: I1010 08:07:20.589421 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4e57-account-create-pvtw5" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.539283 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-pp6p9"] Oct 10 08:07:21 crc kubenswrapper[4822]: E1010 08:07:21.541074 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5de391-05fb-4ce3-9c38-c48a25e28b88" containerName="mariadb-account-create" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.541202 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5de391-05fb-4ce3-9c38-c48a25e28b88" containerName="mariadb-account-create" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.541488 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5de391-05fb-4ce3-9c38-c48a25e28b88" containerName="mariadb-account-create" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.542477 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.552152 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.552231 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-69n2h" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.556492 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pp6p9"] Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.681923 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.682030 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.682088 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqnm\" (UniqueName: \"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.784475 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqnm\" (UniqueName: \"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm\") pod 
\"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.785216 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.785265 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.792863 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.794010 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.807309 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqnm\" (UniqueName: \"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm\") pod \"heat-db-sync-pp6p9\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:21 crc kubenswrapper[4822]: I1010 08:07:21.863746 
4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:22 crc kubenswrapper[4822]: I1010 08:07:22.339108 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pp6p9"] Oct 10 08:07:22 crc kubenswrapper[4822]: I1010 08:07:22.613655 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pp6p9" event={"ID":"7a701b49-d6e2-47dc-a329-9b88b163c568","Type":"ContainerStarted","Data":"b863ed598b4aafa37a2cbbb684f429e1da659c8e1cc3ae8b55c32272f08d613a"} Oct 10 08:07:25 crc kubenswrapper[4822]: I1010 08:07:25.274984 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c5d44449-hr4bs" podUID="5228af92-c9c7-494f-957f-0e63f41ca0eb" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused" Oct 10 08:07:25 crc kubenswrapper[4822]: I1010 08:07:25.578980 4822 scope.go:117] "RemoveContainer" containerID="ba815ef758455e238fed76b31d1a6d4aaef1ab82a0635e4498de556f5a0ec6ed" Oct 10 08:07:28 crc kubenswrapper[4822]: I1010 08:07:28.339937 4822 scope.go:117] "RemoveContainer" containerID="78f555350bc8a215b3b7367b72385696b789d7a0ed01dd7a9bf98e94ce2406ca" Oct 10 08:07:29 crc kubenswrapper[4822]: I1010 08:07:29.692141 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pp6p9" event={"ID":"7a701b49-d6e2-47dc-a329-9b88b163c568","Type":"ContainerStarted","Data":"1806c2420a6ead245afe58116143a58c158d4dc95b705ae7593f931eff8489c8"} Oct 10 08:07:29 crc kubenswrapper[4822]: I1010 08:07:29.713673 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-pp6p9" podStartSLOduration=2.38764866 podStartE2EDuration="8.713650999s" podCreationTimestamp="2025-10-10 08:07:21 +0000 UTC" firstStartedPulling="2025-10-10 08:07:22.356206194 +0000 UTC m=+6189.451364390" 
lastFinishedPulling="2025-10-10 08:07:28.682208533 +0000 UTC m=+6195.777366729" observedRunningTime="2025-10-10 08:07:29.706274636 +0000 UTC m=+6196.801432842" watchObservedRunningTime="2025-10-10 08:07:29.713650999 +0000 UTC m=+6196.808809195" Oct 10 08:07:31 crc kubenswrapper[4822]: I1010 08:07:31.336529 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:07:31 crc kubenswrapper[4822]: I1010 08:07:31.336981 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:07:31 crc kubenswrapper[4822]: I1010 08:07:31.717775 4822 generic.go:334] "Generic (PLEG): container finished" podID="7a701b49-d6e2-47dc-a329-9b88b163c568" containerID="1806c2420a6ead245afe58116143a58c158d4dc95b705ae7593f931eff8489c8" exitCode=0 Oct 10 08:07:31 crc kubenswrapper[4822]: I1010 08:07:31.717840 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pp6p9" event={"ID":"7a701b49-d6e2-47dc-a329-9b88b163c568","Type":"ContainerDied","Data":"1806c2420a6ead245afe58116143a58c158d4dc95b705ae7593f931eff8489c8"} Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.065569 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8qlfm"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.080854 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-j4j2s"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.091486 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-db-create-7vbkg"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.101104 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8qlfm"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.109822 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7vbkg"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.119300 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-j4j2s"] Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.135623 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.261555 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data\") pod \"7a701b49-d6e2-47dc-a329-9b88b163c568\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.261662 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle\") pod \"7a701b49-d6e2-47dc-a329-9b88b163c568\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.261861 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxqnm\" (UniqueName: \"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm\") pod \"7a701b49-d6e2-47dc-a329-9b88b163c568\" (UID: \"7a701b49-d6e2-47dc-a329-9b88b163c568\") " Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.268455 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm" (OuterVolumeSpecName: "kube-api-access-cxqnm") pod "7a701b49-d6e2-47dc-a329-9b88b163c568" (UID: "7a701b49-d6e2-47dc-a329-9b88b163c568"). InnerVolumeSpecName "kube-api-access-cxqnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.294691 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a701b49-d6e2-47dc-a329-9b88b163c568" (UID: "7a701b49-d6e2-47dc-a329-9b88b163c568"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.347397 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data" (OuterVolumeSpecName: "config-data") pod "7a701b49-d6e2-47dc-a329-9b88b163c568" (UID: "7a701b49-d6e2-47dc-a329-9b88b163c568"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.364390 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.364600 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a701b49-d6e2-47dc-a329-9b88b163c568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.364642 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxqnm\" (UniqueName: \"kubernetes.io/projected/7a701b49-d6e2-47dc-a329-9b88b163c568-kube-api-access-cxqnm\") on node \"crc\" DevicePath \"\"" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.666131 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b97456-2a65-4982-9b6c-c8c8557588d3" path="/var/lib/kubelet/pods/04b97456-2a65-4982-9b6c-c8c8557588d3/volumes" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.667026 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c00ec59-c1e4-4cd1-817a-ebff4fcd2284" path="/var/lib/kubelet/pods/4c00ec59-c1e4-4cd1-817a-ebff4fcd2284/volumes" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.667715 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15f4fd3-e407-48bc-921e-181dcc8b9be7" path="/var/lib/kubelet/pods/f15f4fd3-e407-48bc-921e-181dcc8b9be7/volumes" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.738267 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pp6p9" event={"ID":"7a701b49-d6e2-47dc-a329-9b88b163c568","Type":"ContainerDied","Data":"b863ed598b4aafa37a2cbbb684f429e1da659c8e1cc3ae8b55c32272f08d613a"} Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.738325 4822 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b863ed598b4aafa37a2cbbb684f429e1da659c8e1cc3ae8b55c32272f08d613a" Oct 10 08:07:33 crc kubenswrapper[4822]: I1010 08:07:33.738334 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pp6p9" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.152774 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-c46674d4f-g27ln"] Oct 10 08:07:35 crc kubenswrapper[4822]: E1010 08:07:35.153905 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a701b49-d6e2-47dc-a329-9b88b163c568" containerName="heat-db-sync" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.153944 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a701b49-d6e2-47dc-a329-9b88b163c568" containerName="heat-db-sync" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.159535 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a701b49-d6e2-47dc-a329-9b88b163c568" containerName="heat-db-sync" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.160391 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.162773 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.172112 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-69n2h" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.174289 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.193711 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c46674d4f-g27ln"] Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.293676 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7cb6cbddc4-rj72c"] Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.295195 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.305884 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.306993 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-combined-ca-bundle\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.307126 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vljlm\" (UniqueName: \"kubernetes.io/projected/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-kube-api-access-vljlm\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.307560 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data-custom\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.307601 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.316272 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-7cb6cbddc4-rj72c"] Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.374865 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-57469fb88c-jkzr5"] Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.376339 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.379885 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.382601 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57469fb88c-jkzr5"] Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409125 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data-custom\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409211 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409251 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-combined-ca-bundle\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409288 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-combined-ca-bundle\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409325 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vljlm\" (UniqueName: \"kubernetes.io/projected/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-kube-api-access-vljlm\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409344 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmwv\" (UniqueName: \"kubernetes.io/projected/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-kube-api-access-ldmwv\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409391 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data-custom\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.409422 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.417453 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-combined-ca-bundle\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.421548 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data-custom\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.430571 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vljlm\" (UniqueName: \"kubernetes.io/projected/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-kube-api-access-vljlm\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.442304 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c4f605-9277-4ab5-95c8-4c7e1a2e94b1-config-data\") pod \"heat-engine-c46674d4f-g27ln\" (UID: \"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1\") " pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511027 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmwv\" (UniqueName: \"kubernetes.io/projected/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-kube-api-access-ldmwv\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511094 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511169 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pfm\" (UniqueName: \"kubernetes.io/projected/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-kube-api-access-68pfm\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511266 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data-custom\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511320 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data-custom\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511358 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-combined-ca-bundle\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 
08:07:35.511396 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.511448 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-combined-ca-bundle\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.517138 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data-custom\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.517952 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-combined-ca-bundle\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.518998 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-config-data\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.521139 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.538375 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmwv\" (UniqueName: \"kubernetes.io/projected/cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6-kube-api-access-ldmwv\") pod \"heat-api-7cb6cbddc4-rj72c\" (UID: \"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6\") " pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.613503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68pfm\" (UniqueName: \"kubernetes.io/projected/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-kube-api-access-68pfm\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.613692 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data-custom\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.614223 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-combined-ca-bundle\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.614424 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " 
pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.615506 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.619648 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data-custom\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.637318 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-combined-ca-bundle\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.637604 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-config-data\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.641288 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pfm\" (UniqueName: \"kubernetes.io/projected/bec9bcc5-65a6-4a3c-b3db-17ba53221d41-kube-api-access-68pfm\") pod \"heat-cfnapi-57469fb88c-jkzr5\" (UID: \"bec9bcc5-65a6-4a3c-b3db-17ba53221d41\") " pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:35 crc kubenswrapper[4822]: I1010 08:07:35.715743 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.166305 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-c46674d4f-g27ln"] Oct 10 08:07:36 crc kubenswrapper[4822]: W1010 08:07:36.172969 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85c4f605_9277_4ab5_95c8_4c7e1a2e94b1.slice/crio-467512ab8a36e3e6567bdb079ccb5de92325e9e09d5ce984e48be6d84c04ebc5 WatchSource:0}: Error finding container 467512ab8a36e3e6567bdb079ccb5de92325e9e09d5ce984e48be6d84c04ebc5: Status 404 returned error can't find the container with id 467512ab8a36e3e6567bdb079ccb5de92325e9e09d5ce984e48be6d84c04ebc5 Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.404911 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7cb6cbddc4-rj72c"] Oct 10 08:07:36 crc kubenswrapper[4822]: W1010 08:07:36.418924 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbb3bc3e_e7d8_4ef3_9e45_2cbeedb031f6.slice/crio-d985ecc2da44db85bc11ee501197479441cdd05e33b6a8e8744d79ecef1ca2ca WatchSource:0}: Error finding container d985ecc2da44db85bc11ee501197479441cdd05e33b6a8e8744d79ecef1ca2ca: Status 404 returned error can't find the container with id d985ecc2da44db85bc11ee501197479441cdd05e33b6a8e8744d79ecef1ca2ca Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.426621 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57469fb88c-jkzr5"] Oct 10 08:07:36 crc kubenswrapper[4822]: W1010 08:07:36.433309 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbec9bcc5_65a6_4a3c_b3db_17ba53221d41.slice/crio-16fa9b5f7f67ab7e977ca907f2b89201a83492e838b865739622449900a42f2d WatchSource:0}: Error finding container 
16fa9b5f7f67ab7e977ca907f2b89201a83492e838b865739622449900a42f2d: Status 404 returned error can't find the container with id 16fa9b5f7f67ab7e977ca907f2b89201a83492e838b865739622449900a42f2d Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.797935 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c46674d4f-g27ln" event={"ID":"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1","Type":"ContainerStarted","Data":"a7d5ad602585d1e0f9b07514f291308a2eda4fc47ae978c1a6d018da936a49cb"} Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.798314 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-c46674d4f-g27ln" event={"ID":"85c4f605-9277-4ab5-95c8-4c7e1a2e94b1","Type":"ContainerStarted","Data":"467512ab8a36e3e6567bdb079ccb5de92325e9e09d5ce984e48be6d84c04ebc5"} Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.798405 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.799596 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cb6cbddc4-rj72c" event={"ID":"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6","Type":"ContainerStarted","Data":"d985ecc2da44db85bc11ee501197479441cdd05e33b6a8e8744d79ecef1ca2ca"} Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.802584 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" event={"ID":"bec9bcc5-65a6-4a3c-b3db-17ba53221d41","Type":"ContainerStarted","Data":"16fa9b5f7f67ab7e977ca907f2b89201a83492e838b865739622449900a42f2d"} Oct 10 08:07:36 crc kubenswrapper[4822]: I1010 08:07:36.825225 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-c46674d4f-g27ln" podStartSLOduration=1.825202794 podStartE2EDuration="1.825202794s" podCreationTimestamp="2025-10-10 08:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:07:36.816155353 +0000 UTC m=+6203.911313569" watchObservedRunningTime="2025-10-10 08:07:36.825202794 +0000 UTC m=+6203.920360990" Oct 10 08:07:37 crc kubenswrapper[4822]: I1010 08:07:37.737310 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:39 crc kubenswrapper[4822]: I1010 08:07:39.846386 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7cb6cbddc4-rj72c" event={"ID":"cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6","Type":"ContainerStarted","Data":"76573d7ef6b4c4ec162566ff5ff3e79de53c3993478020315a56efb682fcfbbd"} Oct 10 08:07:39 crc kubenswrapper[4822]: I1010 08:07:39.847070 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:39 crc kubenswrapper[4822]: I1010 08:07:39.861026 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" event={"ID":"bec9bcc5-65a6-4a3c-b3db-17ba53221d41","Type":"ContainerStarted","Data":"c1951ebc2e446aff39eb4242ba05b936b81b8543c9a6d641e8c926e0c05fda11"} Oct 10 08:07:39 crc kubenswrapper[4822]: I1010 08:07:39.861952 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:39 crc kubenswrapper[4822]: I1010 08:07:39.888242 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7cb6cbddc4-rj72c" podStartSLOduration=2.655952763 podStartE2EDuration="4.888223966s" podCreationTimestamp="2025-10-10 08:07:35 +0000 UTC" firstStartedPulling="2025-10-10 08:07:36.422771592 +0000 UTC m=+6203.517929788" lastFinishedPulling="2025-10-10 08:07:38.655042795 +0000 UTC m=+6205.750200991" observedRunningTime="2025-10-10 08:07:39.884106177 +0000 UTC m=+6206.979264383" watchObservedRunningTime="2025-10-10 08:07:39.888223966 +0000 UTC m=+6206.983382162" Oct 
10 08:07:40 crc kubenswrapper[4822]: I1010 08:07:40.094221 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54c5d44449-hr4bs" Oct 10 08:07:40 crc kubenswrapper[4822]: I1010 08:07:40.135632 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" podStartSLOduration=2.921009305 podStartE2EDuration="5.135609378s" podCreationTimestamp="2025-10-10 08:07:35 +0000 UTC" firstStartedPulling="2025-10-10 08:07:36.438581988 +0000 UTC m=+6203.533740184" lastFinishedPulling="2025-10-10 08:07:38.653182061 +0000 UTC m=+6205.748340257" observedRunningTime="2025-10-10 08:07:39.922272038 +0000 UTC m=+6207.017430234" watchObservedRunningTime="2025-10-10 08:07:40.135609378 +0000 UTC m=+6207.230767574" Oct 10 08:07:40 crc kubenswrapper[4822]: I1010 08:07:40.168785 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:07:40 crc kubenswrapper[4822]: I1010 08:07:40.169639 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon-log" containerID="cri-o://469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde" gracePeriod=30 Oct 10 08:07:40 crc kubenswrapper[4822]: I1010 08:07:40.170096 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" containerID="cri-o://9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e" gracePeriod=30 Oct 10 08:07:43 crc kubenswrapper[4822]: I1010 08:07:43.058514 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2e5e-account-create-xkp95"] Oct 10 08:07:43 crc kubenswrapper[4822]: I1010 08:07:43.069900 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-2e5e-account-create-xkp95"] Oct 10 08:07:43 crc kubenswrapper[4822]: I1010 08:07:43.662853 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e24cfb7-bc5a-4b10-8b4f-a4df917256f7" path="/var/lib/kubelet/pods/3e24cfb7-bc5a-4b10-8b4f-a4df917256f7/volumes" Oct 10 08:07:43 crc kubenswrapper[4822]: I1010 08:07:43.902350 4822 generic.go:334] "Generic (PLEG): container finished" podID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerID="9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e" exitCode=0 Oct 10 08:07:43 crc kubenswrapper[4822]: I1010 08:07:43.902428 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerDied","Data":"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e"} Oct 10 08:07:44 crc kubenswrapper[4822]: I1010 08:07:44.040538 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f4e8-account-create-s5f9z"] Oct 10 08:07:44 crc kubenswrapper[4822]: I1010 08:07:44.050958 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eaaf-account-create-qdkvx"] Oct 10 08:07:44 crc kubenswrapper[4822]: I1010 08:07:44.059987 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f4e8-account-create-s5f9z"] Oct 10 08:07:44 crc kubenswrapper[4822]: I1010 08:07:44.068198 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eaaf-account-create-qdkvx"] Oct 10 08:07:45 crc kubenswrapper[4822]: I1010 08:07:45.661729 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d4b476-bea7-469c-9cf2-2b50e729ab7b" path="/var/lib/kubelet/pods/a3d4b476-bea7-469c-9cf2-2b50e729ab7b/volumes" Oct 10 08:07:45 crc kubenswrapper[4822]: I1010 08:07:45.662764 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a355f7-9246-47ac-9524-15e54066c591" 
path="/var/lib/kubelet/pods/e8a355f7-9246-47ac-9524-15e54066c591/volumes" Oct 10 08:07:47 crc kubenswrapper[4822]: I1010 08:07:47.226505 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7cb6cbddc4-rj72c" Oct 10 08:07:47 crc kubenswrapper[4822]: I1010 08:07:47.807935 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-57469fb88c-jkzr5" Oct 10 08:07:48 crc kubenswrapper[4822]: I1010 08:07:48.436698 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 10 08:07:52 crc kubenswrapper[4822]: I1010 08:07:52.042336 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmtt7"] Oct 10 08:07:52 crc kubenswrapper[4822]: I1010 08:07:52.053977 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmtt7"] Oct 10 08:07:53 crc kubenswrapper[4822]: I1010 08:07:53.668775 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ae4f6b-1cd9-497b-9665-25b570793997" path="/var/lib/kubelet/pods/54ae4f6b-1cd9-497b-9665-25b570793997/volumes" Oct 10 08:07:55 crc kubenswrapper[4822]: I1010 08:07:55.563670 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-c46674d4f-g27ln" Oct 10 08:07:58 crc kubenswrapper[4822]: I1010 08:07:58.437141 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 10 
08:08:01 crc kubenswrapper[4822]: I1010 08:08:01.337565 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:08:01 crc kubenswrapper[4822]: I1010 08:08:01.338295 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:08:01 crc kubenswrapper[4822]: I1010 08:08:01.338380 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:08:01 crc kubenswrapper[4822]: I1010 08:08:01.339396 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:08:01 crc kubenswrapper[4822]: I1010 08:08:01.339477 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5" gracePeriod=600 Oct 10 08:08:02 crc kubenswrapper[4822]: I1010 08:08:02.088601 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" 
containerID="7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5" exitCode=0 Oct 10 08:08:02 crc kubenswrapper[4822]: I1010 08:08:02.088682 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5"} Oct 10 08:08:02 crc kubenswrapper[4822]: I1010 08:08:02.089131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab"} Oct 10 08:08:02 crc kubenswrapper[4822]: I1010 08:08:02.089161 4822 scope.go:117] "RemoveContainer" containerID="f4486863da5fcea45954fc1372651823ac820a9d5c54e369953b1dc80c620df4" Oct 10 08:08:04 crc kubenswrapper[4822]: I1010 08:08:04.860094 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42"] Oct 10 08:08:04 crc kubenswrapper[4822]: I1010 08:08:04.862849 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:04 crc kubenswrapper[4822]: I1010 08:08:04.865372 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 08:08:04 crc kubenswrapper[4822]: I1010 08:08:04.879979 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42"] Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.027974 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74jx\" (UniqueName: \"kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.028120 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.028193 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: 
I1010 08:08:05.130048 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.130171 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74jx\" (UniqueName: \"kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.130240 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.130882 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.131168 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.156515 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74jx\" (UniqueName: \"kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.193719 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:05 crc kubenswrapper[4822]: I1010 08:08:05.684286 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42"] Oct 10 08:08:06 crc kubenswrapper[4822]: I1010 08:08:06.135701 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerStarted","Data":"6a30d32a1edfac6a74211d43e4ae6dbb6d7a96e428c2478ee2ce1d42ebbdd892"} Oct 10 08:08:06 crc kubenswrapper[4822]: I1010 08:08:06.135771 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerStarted","Data":"345bf76ad2e60372ae3453bdc4bec96ece1baa0b1e82e86deb18f2588c261982"} Oct 10 08:08:07 crc kubenswrapper[4822]: I1010 08:08:07.147943 4822 
generic.go:334] "Generic (PLEG): container finished" podID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerID="6a30d32a1edfac6a74211d43e4ae6dbb6d7a96e428c2478ee2ce1d42ebbdd892" exitCode=0 Oct 10 08:08:07 crc kubenswrapper[4822]: I1010 08:08:07.148126 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerDied","Data":"6a30d32a1edfac6a74211d43e4ae6dbb6d7a96e428c2478ee2ce1d42ebbdd892"} Oct 10 08:08:07 crc kubenswrapper[4822]: I1010 08:08:07.153536 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:08:08 crc kubenswrapper[4822]: I1010 08:08:08.436614 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6bc8479b8f-dzrp9" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 10 08:08:08 crc kubenswrapper[4822]: I1010 08:08:08.438041 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:08:09 crc kubenswrapper[4822]: I1010 08:08:09.176379 4822 generic.go:334] "Generic (PLEG): container finished" podID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerID="bf8eec38ea2591778ff844286b7fc2fb5391159610462340eadfbcc1781b8cab" exitCode=0 Oct 10 08:08:09 crc kubenswrapper[4822]: I1010 08:08:09.176444 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerDied","Data":"bf8eec38ea2591778ff844286b7fc2fb5391159610462340eadfbcc1781b8cab"} Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.197344 4822 generic.go:334] "Generic (PLEG): 
container finished" podID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerID="9eb350f9da0e8c363e73d2af650a606f930cab192d90041e3303f7674fdbb000" exitCode=0 Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.198497 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerDied","Data":"9eb350f9da0e8c363e73d2af650a606f930cab192d90041e3303f7674fdbb000"} Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.674578 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.756387 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts\") pod \"ef5a91b9-7681-41ac-9a5c-77d041506bea\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.756450 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs\") pod \"ef5a91b9-7681-41ac-9a5c-77d041506bea\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.756512 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7\") pod \"ef5a91b9-7681-41ac-9a5c-77d041506bea\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.756546 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data\") pod 
\"ef5a91b9-7681-41ac-9a5c-77d041506bea\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.756564 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key\") pod \"ef5a91b9-7681-41ac-9a5c-77d041506bea\" (UID: \"ef5a91b9-7681-41ac-9a5c-77d041506bea\") " Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.757422 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs" (OuterVolumeSpecName: "logs") pod "ef5a91b9-7681-41ac-9a5c-77d041506bea" (UID: "ef5a91b9-7681-41ac-9a5c-77d041506bea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.768041 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ef5a91b9-7681-41ac-9a5c-77d041506bea" (UID: "ef5a91b9-7681-41ac-9a5c-77d041506bea"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.768197 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7" (OuterVolumeSpecName: "kube-api-access-vpwr7") pod "ef5a91b9-7681-41ac-9a5c-77d041506bea" (UID: "ef5a91b9-7681-41ac-9a5c-77d041506bea"). InnerVolumeSpecName "kube-api-access-vpwr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.786507 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts" (OuterVolumeSpecName: "scripts") pod "ef5a91b9-7681-41ac-9a5c-77d041506bea" (UID: "ef5a91b9-7681-41ac-9a5c-77d041506bea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.795493 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data" (OuterVolumeSpecName: "config-data") pod "ef5a91b9-7681-41ac-9a5c-77d041506bea" (UID: "ef5a91b9-7681-41ac-9a5c-77d041506bea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.859191 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.859243 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef5a91b9-7681-41ac-9a5c-77d041506bea-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.859256 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/ef5a91b9-7681-41ac-9a5c-77d041506bea-kube-api-access-vpwr7\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:10 crc kubenswrapper[4822]: I1010 08:08:10.859271 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef5a91b9-7681-41ac-9a5c-77d041506bea-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:10 crc kubenswrapper[4822]: 
I1010 08:08:10.859306 4822 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef5a91b9-7681-41ac-9a5c-77d041506bea-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.019709 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:11 crc kubenswrapper[4822]: E1010 08:08:11.020242 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.020260 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" Oct 10 08:08:11 crc kubenswrapper[4822]: E1010 08:08:11.020297 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon-log" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.020305 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon-log" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.020596 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon-log" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.020620 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerName="horizon" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.022728 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.037384 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.063265 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8prc\" (UniqueName: \"kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.063435 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.063543 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.166254 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8prc\" (UniqueName: \"kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.166517 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.166693 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.167498 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.167846 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.184608 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8prc\" (UniqueName: \"kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc\") pod \"certified-operators-bgbwm\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.213612 4822 generic.go:334] "Generic (PLEG): container 
finished" podID="ef5a91b9-7681-41ac-9a5c-77d041506bea" containerID="469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde" exitCode=137 Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.213892 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerDied","Data":"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde"} Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.213953 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bc8479b8f-dzrp9" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.215367 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bc8479b8f-dzrp9" event={"ID":"ef5a91b9-7681-41ac-9a5c-77d041506bea","Type":"ContainerDied","Data":"1efe264c6fd79ed1958ad7d521803f791b8e8f5e55f87710cfcb690a5117ccbd"} Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.215526 4822 scope.go:117] "RemoveContainer" containerID="9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.270932 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.280769 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6bc8479b8f-dzrp9"] Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.370652 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.440442 4822 scope.go:117] "RemoveContainer" containerID="469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.473099 4822 scope.go:117] "RemoveContainer" containerID="9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e" Oct 10 08:08:11 crc kubenswrapper[4822]: E1010 08:08:11.474260 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e\": container with ID starting with 9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e not found: ID does not exist" containerID="9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.474285 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e"} err="failed to get container status \"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e\": rpc error: code = NotFound desc = could not find container \"9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e\": container with ID starting with 9e6a46fcb650099297e89d4704d4cffb7e303992101fd4a344e57d1df138188e not found: ID does not exist" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.474306 4822 scope.go:117] "RemoveContainer" containerID="469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde" Oct 10 08:08:11 crc kubenswrapper[4822]: E1010 08:08:11.474655 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde\": container with ID starting with 
469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde not found: ID does not exist" containerID="469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.474682 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde"} err="failed to get container status \"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde\": rpc error: code = NotFound desc = could not find container \"469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde\": container with ID starting with 469c55529a2d70796b0af00095a4127b96aa2d6cadc7ab0a2dad4272f83d6bde not found: ID does not exist" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.587220 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.673631 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5a91b9-7681-41ac-9a5c-77d041506bea" path="/var/lib/kubelet/pods/ef5a91b9-7681-41ac-9a5c-77d041506bea/volumes" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.783482 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util\") pod \"13771e73-9a69-4d77-91da-c3d6f058b6b3\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.783545 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle\") pod \"13771e73-9a69-4d77-91da-c3d6f058b6b3\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.783595 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74jx\" (UniqueName: \"kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx\") pod \"13771e73-9a69-4d77-91da-c3d6f058b6b3\" (UID: \"13771e73-9a69-4d77-91da-c3d6f058b6b3\") " Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.787436 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle" (OuterVolumeSpecName: "bundle") pod "13771e73-9a69-4d77-91da-c3d6f058b6b3" (UID: "13771e73-9a69-4d77-91da-c3d6f058b6b3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.797373 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util" (OuterVolumeSpecName: "util") pod "13771e73-9a69-4d77-91da-c3d6f058b6b3" (UID: "13771e73-9a69-4d77-91da-c3d6f058b6b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.810194 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx" (OuterVolumeSpecName: "kube-api-access-l74jx") pod "13771e73-9a69-4d77-91da-c3d6f058b6b3" (UID: "13771e73-9a69-4d77-91da-c3d6f058b6b3"). InnerVolumeSpecName "kube-api-access-l74jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.885113 4822 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-util\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.885146 4822 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13771e73-9a69-4d77-91da-c3d6f058b6b3-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.885158 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74jx\" (UniqueName: \"kubernetes.io/projected/13771e73-9a69-4d77-91da-c3d6f058b6b3-kube-api-access-l74jx\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:11 crc kubenswrapper[4822]: I1010 08:08:11.978865 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.038578 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wt8m4"] Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.058453 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lmd99"] Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.071728 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wt8m4"] Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.084677 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lmd99"] Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.229728 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" 
event={"ID":"13771e73-9a69-4d77-91da-c3d6f058b6b3","Type":"ContainerDied","Data":"345bf76ad2e60372ae3453bdc4bec96ece1baa0b1e82e86deb18f2588c261982"} Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.229786 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="345bf76ad2e60372ae3453bdc4bec96ece1baa0b1e82e86deb18f2588c261982" Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.231105 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42" Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.233212 4822 generic.go:334] "Generic (PLEG): container finished" podID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerID="9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005" exitCode=0 Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.233443 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerDied","Data":"9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005"} Oct 10 08:08:12 crc kubenswrapper[4822]: I1010 08:08:12.233478 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerStarted","Data":"5d15c082075afb8fa3f1c727af4efb7060313ab247712e313322381864c11a48"} Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.011293 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:13 crc kubenswrapper[4822]: E1010 08:08:13.012460 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="util" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.012491 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="util" Oct 10 08:08:13 crc kubenswrapper[4822]: E1010 08:08:13.012498 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="extract" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.012507 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="extract" Oct 10 08:08:13 crc kubenswrapper[4822]: E1010 08:08:13.012551 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="pull" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.012559 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="pull" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.012819 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="13771e73-9a69-4d77-91da-c3d6f058b6b3" containerName="extract" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.015265 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.057342 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.115563 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.115843 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.115940 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9sq\" (UniqueName: \"kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.217765 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.217868 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hw9sq\" (UniqueName: \"kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.217989 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.218439 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.219198 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.237852 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9sq\" (UniqueName: \"kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq\") pod \"community-operators-89fgb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.257412 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerStarted","Data":"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd"} Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.339242 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.663457 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10209d33-8de9-4152-a66e-e34b045618b4" path="/var/lib/kubelet/pods/10209d33-8de9-4152-a66e-e34b045618b4/volumes" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.678656 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1dfed5c-0c78-4bfb-b4e5-bf19d986619a" path="/var/lib/kubelet/pods/c1dfed5c-0c78-4bfb-b4e5-bf19d986619a/volumes" Oct 10 08:08:13 crc kubenswrapper[4822]: I1010 08:08:13.913772 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:13 crc kubenswrapper[4822]: W1010 08:08:13.918505 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca3e124_87df_4b67_a39a_f046eb941bfb.slice/crio-b45c3a1688fc993ac6ab5462c6ab5183fdf5f60b652f3f0deb56e9afbc2760c8 WatchSource:0}: Error finding container b45c3a1688fc993ac6ab5462c6ab5183fdf5f60b652f3f0deb56e9afbc2760c8: Status 404 returned error can't find the container with id b45c3a1688fc993ac6ab5462c6ab5183fdf5f60b652f3f0deb56e9afbc2760c8 Oct 10 08:08:14 crc kubenswrapper[4822]: I1010 08:08:14.268639 4822 generic.go:334] "Generic (PLEG): container finished" podID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerID="9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd" exitCode=0 Oct 10 08:08:14 crc kubenswrapper[4822]: I1010 08:08:14.268712 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerDied","Data":"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd"} Oct 10 08:08:14 crc kubenswrapper[4822]: I1010 08:08:14.269966 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerStarted","Data":"b45c3a1688fc993ac6ab5462c6ab5183fdf5f60b652f3f0deb56e9afbc2760c8"} Oct 10 08:08:15 crc kubenswrapper[4822]: I1010 08:08:15.294029 4822 generic.go:334] "Generic (PLEG): container finished" podID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerID="e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173" exitCode=0 Oct 10 08:08:15 crc kubenswrapper[4822]: I1010 08:08:15.294449 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerDied","Data":"e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173"} Oct 10 08:08:16 crc kubenswrapper[4822]: I1010 08:08:16.349682 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerStarted","Data":"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe"} Oct 10 08:08:16 crc kubenswrapper[4822]: I1010 08:08:16.352215 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerStarted","Data":"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf"} Oct 10 08:08:16 crc kubenswrapper[4822]: I1010 08:08:16.376834 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bgbwm" podStartSLOduration=3.7641561489999997 
podStartE2EDuration="6.376788187s" podCreationTimestamp="2025-10-10 08:08:10 +0000 UTC" firstStartedPulling="2025-10-10 08:08:12.237722654 +0000 UTC m=+6239.332880850" lastFinishedPulling="2025-10-10 08:08:14.850354692 +0000 UTC m=+6241.945512888" observedRunningTime="2025-10-10 08:08:16.370621179 +0000 UTC m=+6243.465779395" watchObservedRunningTime="2025-10-10 08:08:16.376788187 +0000 UTC m=+6243.471946383" Oct 10 08:08:19 crc kubenswrapper[4822]: I1010 08:08:19.382854 4822 generic.go:334] "Generic (PLEG): container finished" podID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerID="83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf" exitCode=0 Oct 10 08:08:19 crc kubenswrapper[4822]: I1010 08:08:19.382947 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerDied","Data":"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf"} Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.275747 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.304362 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.304486 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.306905 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-csn59" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.307864 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.308359 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.395665 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb766\" (UniqueName: \"kubernetes.io/projected/1d0456c8-3612-481a-a98d-369c33a68812-kube-api-access-qb766\") pod \"obo-prometheus-operator-7c8cf85677-22bmw\" (UID: \"1d0456c8-3612-481a-a98d-369c33a68812\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.397196 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerStarted","Data":"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922"} Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.407126 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.408618 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.412538 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.413123 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qncr2" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.439977 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.441868 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.466012 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.487312 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.491235 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89fgb" podStartSLOduration=4.020734941 podStartE2EDuration="8.491211839s" podCreationTimestamp="2025-10-10 08:08:12 +0000 UTC" firstStartedPulling="2025-10-10 08:08:15.301695433 +0000 UTC m=+6242.396853629" lastFinishedPulling="2025-10-10 08:08:19.772172341 +0000 UTC m=+6246.867330527" observedRunningTime="2025-10-10 08:08:20.440741004 +0000 UTC m=+6247.535899200" watchObservedRunningTime="2025-10-10 08:08:20.491211839 +0000 UTC m=+6247.586370035" 
Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.497509 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.497562 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: \"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.497592 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: \"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.497716 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb766\" (UniqueName: \"kubernetes.io/projected/1d0456c8-3612-481a-a98d-369c33a68812-kube-api-access-qb766\") pod \"obo-prometheus-operator-7c8cf85677-22bmw\" (UID: \"1d0456c8-3612-481a-a98d-369c33a68812\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.498201 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.534635 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb766\" (UniqueName: \"kubernetes.io/projected/1d0456c8-3612-481a-a98d-369c33a68812-kube-api-access-qb766\") pod \"obo-prometheus-operator-7c8cf85677-22bmw\" (UID: \"1d0456c8-3612-481a-a98d-369c33a68812\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.599635 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.599706 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.599735 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: 
\"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.599764 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: \"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.605301 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: \"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.607520 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.607978 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa3f5246-4973-4251-990f-4e6089a952ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg\" (UID: \"aa3f5246-4973-4251-990f-4e6089a952ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.609261 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70fba8c6-e26c-4600-857e-8728d6a7095e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4\" (UID: \"70fba8c6-e26c-4600-857e-8728d6a7095e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.622753 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xhnjq"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.624206 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.628166 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.628506 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-nth22" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.651696 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xhnjq"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.671584 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.702025 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k62z\" (UniqueName: \"kubernetes.io/projected/97c2c1f4-1f4a-4f37-9435-80f0b49de473-kube-api-access-5k62z\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.702160 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97c2c1f4-1f4a-4f37-9435-80f0b49de473-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.740355 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.769767 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.804786 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k62z\" (UniqueName: \"kubernetes.io/projected/97c2c1f4-1f4a-4f37-9435-80f0b49de473-kube-api-access-5k62z\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.804885 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97c2c1f4-1f4a-4f37-9435-80f0b49de473-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.806697 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-slvtm"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.808291 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.810376 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97c2c1f4-1f4a-4f37-9435-80f0b49de473-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.813202 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6swgg" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.819129 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-slvtm"] Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.835059 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k62z\" (UniqueName: \"kubernetes.io/projected/97c2c1f4-1f4a-4f37-9435-80f0b49de473-kube-api-access-5k62z\") pod \"observability-operator-cc5f78dfc-xhnjq\" (UID: \"97c2c1f4-1f4a-4f37-9435-80f0b49de473\") " pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.906741 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/59d48973-1a2f-48f9-b685-62961213d13e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:20 crc kubenswrapper[4822]: I1010 08:08:20.906846 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bhls\" (UniqueName: 
\"kubernetes.io/projected/59d48973-1a2f-48f9-b685-62961213d13e-kube-api-access-7bhls\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.011194 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/59d48973-1a2f-48f9-b685-62961213d13e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.011293 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bhls\" (UniqueName: \"kubernetes.io/projected/59d48973-1a2f-48f9-b685-62961213d13e-kube-api-access-7bhls\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.015268 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/59d48973-1a2f-48f9-b685-62961213d13e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.053585 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.069793 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bhls\" (UniqueName: \"kubernetes.io/projected/59d48973-1a2f-48f9-b685-62961213d13e-kube-api-access-7bhls\") pod \"perses-operator-54bc95c9fb-slvtm\" (UID: \"59d48973-1a2f-48f9-b685-62961213d13e\") " pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.250584 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.371822 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.371868 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.584846 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg"] Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.599102 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw"] Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.745249 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4"] Oct 10 08:08:21 crc kubenswrapper[4822]: W1010 08:08:21.949736 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c2c1f4_1f4a_4f37_9435_80f0b49de473.slice/crio-d24dc4f55f13aca298065d2723cdca364d027a8d2a1b70c22f2863795d2c0768 WatchSource:0}: 
Error finding container d24dc4f55f13aca298065d2723cdca364d027a8d2a1b70c22f2863795d2c0768: Status 404 returned error can't find the container with id d24dc4f55f13aca298065d2723cdca364d027a8d2a1b70c22f2863795d2c0768 Oct 10 08:08:21 crc kubenswrapper[4822]: I1010 08:08:21.952204 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-xhnjq"] Oct 10 08:08:22 crc kubenswrapper[4822]: W1010 08:08:22.059620 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59d48973_1a2f_48f9_b685_62961213d13e.slice/crio-11ef554ffbb9171d17dc79d0d94967d2ff13d8420da26f26e2b4a62c8f3d3004 WatchSource:0}: Error finding container 11ef554ffbb9171d17dc79d0d94967d2ff13d8420da26f26e2b4a62c8f3d3004: Status 404 returned error can't find the container with id 11ef554ffbb9171d17dc79d0d94967d2ff13d8420da26f26e2b4a62c8f3d3004 Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.062668 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-slvtm"] Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.440537 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bgbwm" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="registry-server" probeResult="failure" output=< Oct 10 08:08:22 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:08:22 crc kubenswrapper[4822]: > Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.462598 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" event={"ID":"97c2c1f4-1f4a-4f37-9435-80f0b49de473","Type":"ContainerStarted","Data":"d24dc4f55f13aca298065d2723cdca364d027a8d2a1b70c22f2863795d2c0768"} Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.470453 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" event={"ID":"1d0456c8-3612-481a-a98d-369c33a68812","Type":"ContainerStarted","Data":"14f7b2c02bb988ad73538bbded9b056bd8e3eedcafcbff19bc665219cbb46572"} Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.472942 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" event={"ID":"70fba8c6-e26c-4600-857e-8728d6a7095e","Type":"ContainerStarted","Data":"cc52c348218e06d23b95445f183245f4701a60187b335647cfdee4eba107f229"} Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.478438 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" event={"ID":"aa3f5246-4973-4251-990f-4e6089a952ad","Type":"ContainerStarted","Data":"b97a08ab7da9053ef1443e4a346e90eb5e1df3df3618740e47392dea5e06ec3c"} Oct 10 08:08:22 crc kubenswrapper[4822]: I1010 08:08:22.479512 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" event={"ID":"59d48973-1a2f-48f9-b685-62961213d13e","Type":"ContainerStarted","Data":"11ef554ffbb9171d17dc79d0d94967d2ff13d8420da26f26e2b4a62c8f3d3004"} Oct 10 08:08:23 crc kubenswrapper[4822]: I1010 08:08:23.340219 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:23 crc kubenswrapper[4822]: I1010 08:08:23.340536 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:24 crc kubenswrapper[4822]: I1010 08:08:24.394539 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-89fgb" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="registry-server" probeResult="failure" output=< Oct 10 08:08:24 crc kubenswrapper[4822]: timeout: failed to connect service 
":50051" within 1s Oct 10 08:08:24 crc kubenswrapper[4822]: > Oct 10 08:08:28 crc kubenswrapper[4822]: I1010 08:08:28.772463 4822 scope.go:117] "RemoveContainer" containerID="008e14d1ecfa29f5cd6fbec1ead9cc560e14dde0a74db23761b2e030d4a9f872" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.515457 4822 scope.go:117] "RemoveContainer" containerID="fe189be58c662505c424407541f1be96d092727f14a14af1ad08a9ad47f129fa" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.633620 4822 scope.go:117] "RemoveContainer" containerID="7e0712dae4e893ae4487877923c750d2d3b120c6e4cc787b8b025a7b8df38942" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.753425 4822 scope.go:117] "RemoveContainer" containerID="2e07c28ad20d05c5e64db0fe2e34034b4411b952d55e45cba191f43ce01c7a1b" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.783681 4822 scope.go:117] "RemoveContainer" containerID="6a9f61599a80c837ee3233ca76d0d97bc873d5ea3ad5f72c4039992b577bdf2f" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.833613 4822 scope.go:117] "RemoveContainer" containerID="8b1c79468a6144e4550065b536d488eba6c7caee18b65d8a4ba663cd6eecddc1" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.863036 4822 scope.go:117] "RemoveContainer" containerID="a9af087ffe6d175ba49fb19b7a4029cf02b0fc8d7c57e0dbf3aa864c0e4befe9" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.923646 4822 scope.go:117] "RemoveContainer" containerID="4676f672928c1465d6cbbd7ea59a90a8716372df09d663ea5ea45bdc9b07e188" Oct 10 08:08:30 crc kubenswrapper[4822]: I1010 08:08:30.944313 4822 scope.go:117] "RemoveContainer" containerID="df039484ed1645dd1bd960c601929ef9ecf2a3910b85aeee146870fb07199795" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.044044 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zv84p"] Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.053013 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-zv84p"] Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.427973 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.493455 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.641727 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" event={"ID":"1d0456c8-3612-481a-a98d-369c33a68812","Type":"ContainerStarted","Data":"4a4708d5f15de2a9b67f9809b544457f35f387db352b6db9d42a33a310c98420"} Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.645323 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" event={"ID":"aa3f5246-4973-4251-990f-4e6089a952ad","Type":"ContainerStarted","Data":"9292b6ba1d5595259200bc88b9f34b60e866e675c72b09119aaa502869ec86ff"} Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.648569 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" event={"ID":"59d48973-1a2f-48f9-b685-62961213d13e","Type":"ContainerStarted","Data":"c8d0cd01a402c8149fb58fde8583bcfaa63bcac2e814c5cb489560fee44a3b99"} Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.649512 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.674383 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-22bmw" podStartSLOduration=5.765250692 podStartE2EDuration="11.674354512s" podCreationTimestamp="2025-10-10 08:08:20 +0000 UTC" firstStartedPulling="2025-10-10 
08:08:21.700092689 +0000 UTC m=+6248.795250885" lastFinishedPulling="2025-10-10 08:08:27.609196509 +0000 UTC m=+6254.704354705" observedRunningTime="2025-10-10 08:08:31.663365985 +0000 UTC m=+6258.758524191" watchObservedRunningTime="2025-10-10 08:08:31.674354512 +0000 UTC m=+6258.769512708" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.691766 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528e4fef-6ff0-4f90-9ef0-5f50840bef69" path="/var/lib/kubelet/pods/528e4fef-6ff0-4f90-9ef0-5f50840bef69/volumes" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.692570 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" event={"ID":"97c2c1f4-1f4a-4f37-9435-80f0b49de473","Type":"ContainerStarted","Data":"81e293b591cdf3df41a01cf9418b966ad01b9ad00d401f348cd7cd4f09529540"} Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.692605 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.692646 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.696538 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" event={"ID":"70fba8c6-e26c-4600-857e-8728d6a7095e","Type":"ContainerStarted","Data":"e5c960920309104d1e843cbf1943b184edfaeb0980ea9d804b0a4bc2a33026b6"} Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.734326 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" podStartSLOduration=3.2844675150000002 podStartE2EDuration="11.73429337s" podCreationTimestamp="2025-10-10 08:08:20 +0000 UTC" firstStartedPulling="2025-10-10 
08:08:22.063457485 +0000 UTC m=+6249.158615681" lastFinishedPulling="2025-10-10 08:08:30.51328332 +0000 UTC m=+6257.608441536" observedRunningTime="2025-10-10 08:08:31.695194993 +0000 UTC m=+6258.790353189" watchObservedRunningTime="2025-10-10 08:08:31.73429337 +0000 UTC m=+6258.829451566" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.737524 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg" podStartSLOduration=5.707285561 podStartE2EDuration="11.737511322s" podCreationTimestamp="2025-10-10 08:08:20 +0000 UTC" firstStartedPulling="2025-10-10 08:08:21.577210667 +0000 UTC m=+6248.672368863" lastFinishedPulling="2025-10-10 08:08:27.607436428 +0000 UTC m=+6254.702594624" observedRunningTime="2025-10-10 08:08:31.726185986 +0000 UTC m=+6258.821344182" watchObservedRunningTime="2025-10-10 08:08:31.737511322 +0000 UTC m=+6258.832669518" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.790013 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-xhnjq" podStartSLOduration=3.109438878 podStartE2EDuration="11.789975135s" podCreationTimestamp="2025-10-10 08:08:20 +0000 UTC" firstStartedPulling="2025-10-10 08:08:21.953247787 +0000 UTC m=+6249.048405983" lastFinishedPulling="2025-10-10 08:08:30.633784044 +0000 UTC m=+6257.728942240" observedRunningTime="2025-10-10 08:08:31.768106274 +0000 UTC m=+6258.863264490" watchObservedRunningTime="2025-10-10 08:08:31.789975135 +0000 UTC m=+6258.885133331" Oct 10 08:08:31 crc kubenswrapper[4822]: I1010 08:08:31.817147 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4" podStartSLOduration=5.974117543 podStartE2EDuration="11.817128408s" podCreationTimestamp="2025-10-10 08:08:20 +0000 UTC" firstStartedPulling="2025-10-10 08:08:21.763365283 
+0000 UTC m=+6248.858523479" lastFinishedPulling="2025-10-10 08:08:27.606376148 +0000 UTC m=+6254.701534344" observedRunningTime="2025-10-10 08:08:31.815024317 +0000 UTC m=+6258.910182523" watchObservedRunningTime="2025-10-10 08:08:31.817128408 +0000 UTC m=+6258.912286594" Oct 10 08:08:32 crc kubenswrapper[4822]: I1010 08:08:32.990104 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:32 crc kubenswrapper[4822]: I1010 08:08:32.991777 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bgbwm" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="registry-server" containerID="cri-o://a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe" gracePeriod=2 Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.398724 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.464640 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.561300 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.624155 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities\") pod \"485bf337-e9e8-4609-af6a-2733ec9db8a7\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.624325 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content\") pod \"485bf337-e9e8-4609-af6a-2733ec9db8a7\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.624459 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8prc\" (UniqueName: \"kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc\") pod \"485bf337-e9e8-4609-af6a-2733ec9db8a7\" (UID: \"485bf337-e9e8-4609-af6a-2733ec9db8a7\") " Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.624781 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities" (OuterVolumeSpecName: "utilities") pod "485bf337-e9e8-4609-af6a-2733ec9db8a7" (UID: "485bf337-e9e8-4609-af6a-2733ec9db8a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.625157 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.632196 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc" (OuterVolumeSpecName: "kube-api-access-l8prc") pod "485bf337-e9e8-4609-af6a-2733ec9db8a7" (UID: "485bf337-e9e8-4609-af6a-2733ec9db8a7"). InnerVolumeSpecName "kube-api-access-l8prc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.687040 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "485bf337-e9e8-4609-af6a-2733ec9db8a7" (UID: "485bf337-e9e8-4609-af6a-2733ec9db8a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.726340 4822 generic.go:334] "Generic (PLEG): container finished" podID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerID="a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe" exitCode=0 Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.726449 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerDied","Data":"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe"} Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.726536 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgbwm" event={"ID":"485bf337-e9e8-4609-af6a-2733ec9db8a7","Type":"ContainerDied","Data":"5d15c082075afb8fa3f1c727af4efb7060313ab247712e313322381864c11a48"} Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.726560 4822 scope.go:117] "RemoveContainer" containerID="a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.726833 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgbwm" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.728458 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485bf337-e9e8-4609-af6a-2733ec9db8a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.728506 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8prc\" (UniqueName: \"kubernetes.io/projected/485bf337-e9e8-4609-af6a-2733ec9db8a7-kube-api-access-l8prc\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.760401 4822 scope.go:117] "RemoveContainer" containerID="9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.765844 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.788477 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bgbwm"] Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.807424 4822 scope.go:117] "RemoveContainer" containerID="9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.836948 4822 scope.go:117] "RemoveContainer" containerID="a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe" Oct 10 08:08:33 crc kubenswrapper[4822]: E1010 08:08:33.837457 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe\": container with ID starting with a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe not found: ID does not exist" containerID="a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe" Oct 10 08:08:33 crc 
kubenswrapper[4822]: I1010 08:08:33.837513 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe"} err="failed to get container status \"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe\": rpc error: code = NotFound desc = could not find container \"a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe\": container with ID starting with a66d2c0571b56adab5cd80b299a05d3589af24118bc3b6bffaa4c20656055bbe not found: ID does not exist" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.837559 4822 scope.go:117] "RemoveContainer" containerID="9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd" Oct 10 08:08:33 crc kubenswrapper[4822]: E1010 08:08:33.838189 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd\": container with ID starting with 9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd not found: ID does not exist" containerID="9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.838237 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd"} err="failed to get container status \"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd\": rpc error: code = NotFound desc = could not find container \"9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd\": container with ID starting with 9248b3de44fb6d77240bbea96b729d52ca27ef9661f85ef125b38a9a5a760fdd not found: ID does not exist" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.838271 4822 scope.go:117] "RemoveContainer" containerID="9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005" Oct 10 
08:08:33 crc kubenswrapper[4822]: E1010 08:08:33.838562 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005\": container with ID starting with 9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005 not found: ID does not exist" containerID="9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005" Oct 10 08:08:33 crc kubenswrapper[4822]: I1010 08:08:33.838593 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005"} err="failed to get container status \"9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005\": rpc error: code = NotFound desc = could not find container \"9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005\": container with ID starting with 9a0c298101c7e092348a8cd0925939eaeffcad701ae923475c1cc6242d9d4005 not found: ID does not exist" Oct 10 08:08:35 crc kubenswrapper[4822]: I1010 08:08:35.662324 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" path="/var/lib/kubelet/pods/485bf337-e9e8-4609-af6a-2733ec9db8a7/volumes" Oct 10 08:08:36 crc kubenswrapper[4822]: I1010 08:08:36.993582 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:36 crc kubenswrapper[4822]: I1010 08:08:36.994050 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89fgb" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="registry-server" containerID="cri-o://2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922" gracePeriod=2 Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.531958 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.613625 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content\") pod \"aca3e124-87df-4b67-a39a-f046eb941bfb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.613701 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw9sq\" (UniqueName: \"kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq\") pod \"aca3e124-87df-4b67-a39a-f046eb941bfb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.613864 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities\") pod \"aca3e124-87df-4b67-a39a-f046eb941bfb\" (UID: \"aca3e124-87df-4b67-a39a-f046eb941bfb\") " Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.614604 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities" (OuterVolumeSpecName: "utilities") pod "aca3e124-87df-4b67-a39a-f046eb941bfb" (UID: "aca3e124-87df-4b67-a39a-f046eb941bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.622168 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq" (OuterVolumeSpecName: "kube-api-access-hw9sq") pod "aca3e124-87df-4b67-a39a-f046eb941bfb" (UID: "aca3e124-87df-4b67-a39a-f046eb941bfb"). InnerVolumeSpecName "kube-api-access-hw9sq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.666075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aca3e124-87df-4b67-a39a-f046eb941bfb" (UID: "aca3e124-87df-4b67-a39a-f046eb941bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.716793 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.716845 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw9sq\" (UniqueName: \"kubernetes.io/projected/aca3e124-87df-4b67-a39a-f046eb941bfb-kube-api-access-hw9sq\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.716864 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca3e124-87df-4b67-a39a-f046eb941bfb-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.764948 4822 generic.go:334] "Generic (PLEG): container finished" podID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerID="2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922" exitCode=0 Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.764994 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerDied","Data":"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922"} Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.765025 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-89fgb" event={"ID":"aca3e124-87df-4b67-a39a-f046eb941bfb","Type":"ContainerDied","Data":"b45c3a1688fc993ac6ab5462c6ab5183fdf5f60b652f3f0deb56e9afbc2760c8"} Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.765049 4822 scope.go:117] "RemoveContainer" containerID="2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.765177 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89fgb" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.794136 4822 scope.go:117] "RemoveContainer" containerID="83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.846201 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.870140 4822 scope.go:117] "RemoveContainer" containerID="e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.894570 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89fgb"] Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.935009 4822 scope.go:117] "RemoveContainer" containerID="2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922" Oct 10 08:08:37 crc kubenswrapper[4822]: E1010 08:08:37.938938 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922\": container with ID starting with 2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922 not found: ID does not exist" containerID="2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 
08:08:37.938983 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922"} err="failed to get container status \"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922\": rpc error: code = NotFound desc = could not find container \"2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922\": container with ID starting with 2f0d55984abb0d6b62586ea3a2ac09a883df2030b413d47c604bc54965330922 not found: ID does not exist" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.939011 4822 scope.go:117] "RemoveContainer" containerID="83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf" Oct 10 08:08:37 crc kubenswrapper[4822]: E1010 08:08:37.946992 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf\": container with ID starting with 83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf not found: ID does not exist" containerID="83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.947064 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf"} err="failed to get container status \"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf\": rpc error: code = NotFound desc = could not find container \"83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf\": container with ID starting with 83f1887ab95106e49154a87067175f060b7e8dd37bea922a15185dc9f5c15ebf not found: ID does not exist" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.947100 4822 scope.go:117] "RemoveContainer" containerID="e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173" Oct 10 08:08:37 crc 
kubenswrapper[4822]: E1010 08:08:37.950980 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173\": container with ID starting with e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173 not found: ID does not exist" containerID="e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173" Oct 10 08:08:37 crc kubenswrapper[4822]: I1010 08:08:37.951036 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173"} err="failed to get container status \"e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173\": rpc error: code = NotFound desc = could not find container \"e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173\": container with ID starting with e73c860b8328fd5ffc4307393dd751ed74911ca88874d601e84221dcbdfd3173 not found: ID does not exist" Oct 10 08:08:39 crc kubenswrapper[4822]: I1010 08:08:39.664111 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" path="/var/lib/kubelet/pods/aca3e124-87df-4b67-a39a-f046eb941bfb/volumes" Oct 10 08:08:41 crc kubenswrapper[4822]: I1010 08:08:41.257938 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-slvtm" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.664353 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.665169 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" containerName="openstackclient" containerID="cri-o://4eb2c3ceac709c2141c39793b4ca227d98d49acaa877520386a9653a3d5f9e0a" gracePeriod=2 Oct 10 08:08:43 
crc kubenswrapper[4822]: I1010 08:08:43.673614 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721062 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721520 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" containerName="openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721540 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" containerName="openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721563 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="extract-content" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721570 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="extract-content" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721589 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="extract-utilities" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721597 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="extract-utilities" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721622 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="extract-utilities" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721630 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="extract-utilities" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721645 4822 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="extract-content" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721653 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="extract-content" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721668 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721675 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: E1010 08:08:43.721690 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721697 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721910 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca3e124-87df-4b67-a39a-f046eb941bfb" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721933 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" containerName="openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.721957 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="485bf337-e9e8-4609-af6a-2733ec9db8a7" containerName="registry-server" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.722822 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.737036 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.740793 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7d4\" (UniqueName: \"kubernetes.io/projected/a3817196-a9d9-404a-88be-36429ce51c70-kube-api-access-xc7d4\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.740848 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.740930 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.774046 4822 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" podUID="a3817196-a9d9-404a-88be-36429ce51c70" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.842851 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7d4\" (UniqueName: \"kubernetes.io/projected/a3817196-a9d9-404a-88be-36429ce51c70-kube-api-access-xc7d4\") pod \"openstackclient\" (UID: 
\"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.842896 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.842983 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.843959 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.857351 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3817196-a9d9-404a-88be-36429ce51c70-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.887905 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7d4\" (UniqueName: \"kubernetes.io/projected/a3817196-a9d9-404a-88be-36429ce51c70-kube-api-access-xc7d4\") pod \"openstackclient\" (UID: \"a3817196-a9d9-404a-88be-36429ce51c70\") " pod="openstack/openstackclient" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.952792 4822 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.954648 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.959709 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gdkvh" Oct 10 08:08:43 crc kubenswrapper[4822]: I1010 08:08:43.980733 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.046581 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6g7\" (UniqueName: \"kubernetes.io/projected/60b95e5c-694d-4b82-b3db-2aa33ebd7189-kube-api-access-5b6g7\") pod \"kube-state-metrics-0\" (UID: \"60b95e5c-694d-4b82-b3db-2aa33ebd7189\") " pod="openstack/kube-state-metrics-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.075397 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.151144 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6g7\" (UniqueName: \"kubernetes.io/projected/60b95e5c-694d-4b82-b3db-2aa33ebd7189-kube-api-access-5b6g7\") pod \"kube-state-metrics-0\" (UID: \"60b95e5c-694d-4b82-b3db-2aa33ebd7189\") " pod="openstack/kube-state-metrics-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.191728 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6g7\" (UniqueName: \"kubernetes.io/projected/60b95e5c-694d-4b82-b3db-2aa33ebd7189-kube-api-access-5b6g7\") pod \"kube-state-metrics-0\" (UID: \"60b95e5c-694d-4b82-b3db-2aa33ebd7189\") " pod="openstack/kube-state-metrics-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.275436 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.822334 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.825626 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.843406 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.843846 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.843916 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-zwftj" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.844040 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.873366 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.973626 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.973682 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.973765 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rt8\" (UniqueName: 
\"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-kube-api-access-z6rt8\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.974033 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.974063 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:44 crc kubenswrapper[4822]: I1010 08:08:44.974093 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.076315 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.077275 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.077418 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rt8\" (UniqueName: \"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-kube-api-access-z6rt8\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.077517 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.077547 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.077577 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.078343 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.083475 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.084914 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.098285 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a497eae8-7b84-4b5e-9916-2d07ccef9712-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.109496 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.124756 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rt8\" (UniqueName: 
\"kubernetes.io/projected/a497eae8-7b84-4b5e-9916-2d07ccef9712-kube-api-access-z6rt8\") pod \"alertmanager-metric-storage-0\" (UID: \"a497eae8-7b84-4b5e-9916-2d07ccef9712\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.155940 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.157962 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.290573 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.354438 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.357544 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.369546 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.369924 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.370140 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.370413 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tc9k4" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.374139 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 10 08:08:45 
crc kubenswrapper[4822]: I1010 08:08:45.374290 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.393576 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487316 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487366 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487395 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-48546263-9a11-4929-9554-b873c6b4ca9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48546263-9a11-4929-9554-b873c6b4ca9a\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487425 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " 
pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487455 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqz74\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-kube-api-access-jqz74\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487476 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487518 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/600d3d37-326b-432d-8fb9-1eab04ab53e9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.487613 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/600d3d37-326b-432d-8fb9-1eab04ab53e9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589414 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589476 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqz74\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-kube-api-access-jqz74\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589529 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/600d3d37-326b-432d-8fb9-1eab04ab53e9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589603 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/600d3d37-326b-432d-8fb9-1eab04ab53e9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589685 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589704 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.589726 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-48546263-9a11-4929-9554-b873c6b4ca9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48546263-9a11-4929-9554-b873c6b4ca9a\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.590798 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/600d3d37-326b-432d-8fb9-1eab04ab53e9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.595738 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.596888 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.598893 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/600d3d37-326b-432d-8fb9-1eab04ab53e9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.603027 4822 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.603087 4822 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-48546263-9a11-4929-9554-b873c6b4ca9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48546263-9a11-4929-9554-b873c6b4ca9a\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2aef177c6f664ba7e4240d526fc8ef01686587418d76dc3a307d9bfcf3e653ca/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.603202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/600d3d37-326b-432d-8fb9-1eab04ab53e9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.607358 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.662406 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqz74\" (UniqueName: \"kubernetes.io/projected/600d3d37-326b-432d-8fb9-1eab04ab53e9-kube-api-access-jqz74\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.778129 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-48546263-9a11-4929-9554-b873c6b4ca9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48546263-9a11-4929-9554-b873c6b4ca9a\") pod \"prometheus-metric-storage-0\" (UID: \"600d3d37-326b-432d-8fb9-1eab04ab53e9\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.865231 4822 generic.go:334] "Generic (PLEG): container finished" podID="102b4c01-91bf-4da7-a331-d4dad98f4d39" containerID="4eb2c3ceac709c2141c39793b4ca227d98d49acaa877520386a9653a3d5f9e0a" exitCode=137 Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.873088 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60b95e5c-694d-4b82-b3db-2aa33ebd7189","Type":"ContainerStarted","Data":"80d1f508d3aee5b20b95e799e5a6cdbd92b2a5904780d0c9ffb62077cef02e67"} Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.874876 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a3817196-a9d9-404a-88be-36429ce51c70","Type":"ContainerStarted","Data":"928d941f0d5bcc01df4586e7825ca75d32c60a4f59b13423b023d2c061a4a201"} Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.874896 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"a3817196-a9d9-404a-88be-36429ce51c70","Type":"ContainerStarted","Data":"6f9f5a934c7830ab3bc64c5483e61f6f80cac7a6e3579ce7e11af34ac8561672"} Oct 10 08:08:45 crc kubenswrapper[4822]: I1010 08:08:45.925183 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.925157031 podStartE2EDuration="2.925157031s" podCreationTimestamp="2025-10-10 08:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:08:45.91645305 +0000 UTC m=+6273.011611246" watchObservedRunningTime="2025-10-10 08:08:45.925157031 +0000 UTC m=+6273.020315247" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.003074 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.135103 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.307434 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.454913 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config\") pod \"102b4c01-91bf-4da7-a331-d4dad98f4d39\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.455399 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnflp\" (UniqueName: \"kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp\") pod \"102b4c01-91bf-4da7-a331-d4dad98f4d39\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.455485 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret\") pod \"102b4c01-91bf-4da7-a331-d4dad98f4d39\" (UID: \"102b4c01-91bf-4da7-a331-d4dad98f4d39\") " Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.461020 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp" (OuterVolumeSpecName: "kube-api-access-fnflp") pod "102b4c01-91bf-4da7-a331-d4dad98f4d39" (UID: "102b4c01-91bf-4da7-a331-d4dad98f4d39"). InnerVolumeSpecName "kube-api-access-fnflp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.484717 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "102b4c01-91bf-4da7-a331-d4dad98f4d39" (UID: "102b4c01-91bf-4da7-a331-d4dad98f4d39"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.521698 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "102b4c01-91bf-4da7-a331-d4dad98f4d39" (UID: "102b4c01-91bf-4da7-a331-d4dad98f4d39"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.558362 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.558400 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnflp\" (UniqueName: \"kubernetes.io/projected/102b4c01-91bf-4da7-a331-d4dad98f4d39-kube-api-access-fnflp\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.558414 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/102b4c01-91bf-4da7-a331-d4dad98f4d39-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 08:08:46 crc kubenswrapper[4822]: W1010 08:08:46.692099 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod600d3d37_326b_432d_8fb9_1eab04ab53e9.slice/crio-e8b63c2b446d243d6475b33d727f8790472084f39b293d2307f48075e6fbb97b WatchSource:0}: Error finding container e8b63c2b446d243d6475b33d727f8790472084f39b293d2307f48075e6fbb97b: Status 404 returned error can't find the container with id e8b63c2b446d243d6475b33d727f8790472084f39b293d2307f48075e6fbb97b Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.705787 
4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.891423 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerStarted","Data":"e8b63c2b446d243d6475b33d727f8790472084f39b293d2307f48075e6fbb97b"} Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.895669 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a497eae8-7b84-4b5e-9916-2d07ccef9712","Type":"ContainerStarted","Data":"72e03290f9214072ddccd04997c7977258e85333c1de11c547dfd03189bfbcf0"} Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.898696 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60b95e5c-694d-4b82-b3db-2aa33ebd7189","Type":"ContainerStarted","Data":"3498884cfdc203d91c313a1fb5407e43cd9279819a51d88ce62f778c66072255"} Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.899314 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.911563 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.912207 4822 scope.go:117] "RemoveContainer" containerID="4eb2c3ceac709c2141c39793b4ca227d98d49acaa877520386a9653a3d5f9e0a" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.932146 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.374736671 podStartE2EDuration="3.93211878s" podCreationTimestamp="2025-10-10 08:08:43 +0000 UTC" firstStartedPulling="2025-10-10 08:08:45.38505232 +0000 UTC m=+6272.480210516" lastFinishedPulling="2025-10-10 08:08:45.942434429 +0000 UTC m=+6273.037592625" observedRunningTime="2025-10-10 08:08:46.919119156 +0000 UTC m=+6274.014277352" watchObservedRunningTime="2025-10-10 08:08:46.93211878 +0000 UTC m=+6274.027276976" Oct 10 08:08:46 crc kubenswrapper[4822]: I1010 08:08:46.935210 4822 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" podUID="a3817196-a9d9-404a-88be-36429ce51c70" Oct 10 08:08:47 crc kubenswrapper[4822]: I1010 08:08:47.662544 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="102b4c01-91bf-4da7-a331-d4dad98f4d39" path="/var/lib/kubelet/pods/102b4c01-91bf-4da7-a331-d4dad98f4d39/volumes" Oct 10 08:08:51 crc kubenswrapper[4822]: I1010 08:08:51.961217 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a497eae8-7b84-4b5e-9916-2d07ccef9712","Type":"ContainerStarted","Data":"4189b3198ceb88cd06c7f8b0918d6836c749baac2c981e33657d58a5cd5184c0"} Oct 10 08:08:51 crc kubenswrapper[4822]: I1010 08:08:51.963152 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerStarted","Data":"7c173b77fa62116657a5413b9acbe5daf9e922f4389e2f02126c2cb7e7f0c596"} Oct 10 08:08:54 crc kubenswrapper[4822]: I1010 08:08:54.280601 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 10 08:09:00 crc kubenswrapper[4822]: I1010 08:09:00.053739 4822 generic.go:334] "Generic (PLEG): container finished" podID="600d3d37-326b-432d-8fb9-1eab04ab53e9" containerID="7c173b77fa62116657a5413b9acbe5daf9e922f4389e2f02126c2cb7e7f0c596" exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4822]: I1010 08:09:00.053830 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerDied","Data":"7c173b77fa62116657a5413b9acbe5daf9e922f4389e2f02126c2cb7e7f0c596"} Oct 10 08:09:01 crc kubenswrapper[4822]: I1010 08:09:01.073541 4822 generic.go:334] "Generic (PLEG): container finished" podID="a497eae8-7b84-4b5e-9916-2d07ccef9712" containerID="4189b3198ceb88cd06c7f8b0918d6836c749baac2c981e33657d58a5cd5184c0" exitCode=0 Oct 10 08:09:01 crc kubenswrapper[4822]: I1010 08:09:01.073679 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a497eae8-7b84-4b5e-9916-2d07ccef9712","Type":"ContainerDied","Data":"4189b3198ceb88cd06c7f8b0918d6836c749baac2c981e33657d58a5cd5184c0"} Oct 10 08:09:06 crc kubenswrapper[4822]: I1010 08:09:06.140279 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerStarted","Data":"519f38c2a877f5a8f6ffcf40057e6d8923ff0a381c98da02efed0b91ea9514da"} Oct 10 08:09:06 crc kubenswrapper[4822]: I1010 08:09:06.143253 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"a497eae8-7b84-4b5e-9916-2d07ccef9712","Type":"ContainerStarted","Data":"db98803b13d366ae4ae3b956510207dd53025dd34e7cc84aedcc65e614c7d415"} Oct 10 08:09:10 crc kubenswrapper[4822]: I1010 08:09:10.195476 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerStarted","Data":"434625ec353df37278a73e422ed383dc316d04f4930398bf0c63236f5e7871bc"} Oct 10 08:09:10 crc kubenswrapper[4822]: I1010 08:09:10.197922 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a497eae8-7b84-4b5e-9916-2d07ccef9712","Type":"ContainerStarted","Data":"fd58a56628dabff093a07dc212c58fa50af6a0ae2741fce871192508bb1780a4"} Oct 10 08:09:10 crc kubenswrapper[4822]: I1010 08:09:10.198439 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:09:10 crc kubenswrapper[4822]: I1010 08:09:10.203469 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:09:10 crc kubenswrapper[4822]: I1010 08:09:10.257682 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.142208281 podStartE2EDuration="26.257658711s" podCreationTimestamp="2025-10-10 08:08:44 +0000 UTC" firstStartedPulling="2025-10-10 08:08:46.17313628 +0000 UTC m=+6273.268294476" lastFinishedPulling="2025-10-10 08:09:05.2885867 +0000 UTC m=+6292.383744906" observedRunningTime="2025-10-10 08:09:10.225464513 +0000 UTC m=+6297.320622719" watchObservedRunningTime="2025-10-10 08:09:10.257658711 +0000 UTC m=+6297.352816897" Oct 10 08:09:13 crc kubenswrapper[4822]: I1010 08:09:13.234620 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"600d3d37-326b-432d-8fb9-1eab04ab53e9","Type":"ContainerStarted","Data":"d37842f050f52a1e9dd1b1c6e01d8595755abf13053b9de4e362bbf0df94ec4d"} Oct 10 08:09:13 crc kubenswrapper[4822]: I1010 08:09:13.263770 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.668090288 podStartE2EDuration="29.263749492s" podCreationTimestamp="2025-10-10 08:08:44 +0000 UTC" firstStartedPulling="2025-10-10 08:08:46.693942814 +0000 UTC m=+6273.789101010" lastFinishedPulling="2025-10-10 08:09:12.289602008 +0000 UTC m=+6299.384760214" observedRunningTime="2025-10-10 08:09:13.257748329 +0000 UTC m=+6300.352906525" watchObservedRunningTime="2025-10-10 08:09:13.263749492 +0000 UTC m=+6300.358907688" Oct 10 08:09:15 crc kubenswrapper[4822]: I1010 08:09:15.059068 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jj4c9"] Oct 10 08:09:15 crc kubenswrapper[4822]: I1010 08:09:15.070932 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jj4c9"] Oct 10 08:09:15 crc kubenswrapper[4822]: I1010 08:09:15.668725 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83dd9e7-73fd-49a2-b77f-389ffc3b6f13" path="/var/lib/kubelet/pods/d83dd9e7-73fd-49a2-b77f-389ffc3b6f13/volumes" Oct 10 08:09:16 crc kubenswrapper[4822]: I1010 08:09:16.005002 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 10 08:09:16 crc kubenswrapper[4822]: I1010 08:09:16.005371 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 10 08:09:16 crc kubenswrapper[4822]: I1010 08:09:16.008023 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 10 08:09:16 crc kubenswrapper[4822]: I1010 08:09:16.278829 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.453130 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.457139 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.460885 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.472350 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.473972 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643556 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643596 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd4wq\" (UniqueName: \"kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643626 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " 
pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643681 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643709 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643726 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.643778 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.745726 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.745795 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.745909 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.746032 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.746055 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd4wq\" (UniqueName: \"kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.746087 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.746769 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc 
kubenswrapper[4822]: I1010 08:09:19.747232 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.747540 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.759116 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.759156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.759116 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.759235 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data\") pod \"ceilometer-0\" (UID: 
\"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.776352 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd4wq\" (UniqueName: \"kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq\") pod \"ceilometer-0\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " pod="openstack/ceilometer-0" Oct 10 08:09:19 crc kubenswrapper[4822]: I1010 08:09:19.792791 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:09:20 crc kubenswrapper[4822]: I1010 08:09:20.309358 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:09:20 crc kubenswrapper[4822]: W1010 08:09:20.320603 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef5a824_bcdd_4657_b925_2c5c328fd483.slice/crio-f2975309681e3adbd169e87e8069c0fc21cbdc617cae9716363016b4e57a075a WatchSource:0}: Error finding container f2975309681e3adbd169e87e8069c0fc21cbdc617cae9716363016b4e57a075a: Status 404 returned error can't find the container with id f2975309681e3adbd169e87e8069c0fc21cbdc617cae9716363016b4e57a075a Oct 10 08:09:21 crc kubenswrapper[4822]: I1010 08:09:21.320392 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerStarted","Data":"cb1be96061365564ddfd3c85220ddea0c2af34de306695d165f80bc5602ecad2"} Oct 10 08:09:21 crc kubenswrapper[4822]: I1010 08:09:21.321039 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerStarted","Data":"f2975309681e3adbd169e87e8069c0fc21cbdc617cae9716363016b4e57a075a"} Oct 10 08:09:22 crc kubenswrapper[4822]: I1010 08:09:22.334505 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerStarted","Data":"c3dabefd51f769325697f6e46523f55ca9d566c322a792789f1346ca8fac8722"} Oct 10 08:09:23 crc kubenswrapper[4822]: I1010 08:09:23.345203 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerStarted","Data":"c704b44cc482b85001b51ec1105f06bc2c659e22cd961082ea9815d688af417c"} Oct 10 08:09:24 crc kubenswrapper[4822]: I1010 08:09:24.360491 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerStarted","Data":"092089dfe28a7f187b3e030e588cc348b13b3152be1b1d3b88bf1206fd1228c6"} Oct 10 08:09:24 crc kubenswrapper[4822]: I1010 08:09:24.361101 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:09:24 crc kubenswrapper[4822]: I1010 08:09:24.394114 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.097058594 podStartE2EDuration="5.394096813s" podCreationTimestamp="2025-10-10 08:09:19 +0000 UTC" firstStartedPulling="2025-10-10 08:09:20.323920166 +0000 UTC m=+6307.419078362" lastFinishedPulling="2025-10-10 08:09:23.620958385 +0000 UTC m=+6310.716116581" observedRunningTime="2025-10-10 08:09:24.393198408 +0000 UTC m=+6311.488356684" watchObservedRunningTime="2025-10-10 08:09:24.394096813 +0000 UTC m=+6311.489255009" Oct 10 08:09:26 crc kubenswrapper[4822]: I1010 08:09:26.048162 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fd16-account-create-dhss6"] Oct 10 08:09:26 crc kubenswrapper[4822]: I1010 08:09:26.059770 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fd16-account-create-dhss6"] Oct 10 08:09:27 crc kubenswrapper[4822]: I1010 08:09:27.673084 4822 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="439c4dfe-0c23-4109-8c55-202a6d68fa41" path="/var/lib/kubelet/pods/439c4dfe-0c23-4109-8c55-202a6d68fa41/volumes" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.072670 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-mbhwj"] Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.074732 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.087159 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mbhwj"] Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.200468 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8cmb\" (UniqueName: \"kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb\") pod \"aodh-db-create-mbhwj\" (UID: \"b93d3b39-e873-4822-98ab-c02d72ffc7a1\") " pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.303036 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8cmb\" (UniqueName: \"kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb\") pod \"aodh-db-create-mbhwj\" (UID: \"b93d3b39-e873-4822-98ab-c02d72ffc7a1\") " pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.322889 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8cmb\" (UniqueName: \"kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb\") pod \"aodh-db-create-mbhwj\" (UID: \"b93d3b39-e873-4822-98ab-c02d72ffc7a1\") " pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.405643 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:30 crc kubenswrapper[4822]: I1010 08:09:30.916525 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mbhwj"] Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.188951 4822 scope.go:117] "RemoveContainer" containerID="6ca8f626cd853496ad9a6f935f13458a46f259efba30be5c170966de087e4dff" Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.231489 4822 scope.go:117] "RemoveContainer" containerID="fcf6e3f5439f7beaceadd9da96a2d5962cb486d6849e01c7cfca584bec68ec36" Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.292684 4822 scope.go:117] "RemoveContainer" containerID="352be9c9d2e6f65fc4e5a6dc095ad8f858070b8e1964a64d04e36b73a81c8bea" Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.326825 4822 scope.go:117] "RemoveContainer" containerID="793af02f6b261990a5b68b6f2a67898a2978511b72f67751c0058a9ff50d5f12" Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.435172 4822 generic.go:334] "Generic (PLEG): container finished" podID="b93d3b39-e873-4822-98ab-c02d72ffc7a1" containerID="0fd944e0d90dd2d6c89f9ef2a20c073169a61b490982190dcc56ca2c9a20f739" exitCode=0 Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.435238 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mbhwj" event={"ID":"b93d3b39-e873-4822-98ab-c02d72ffc7a1","Type":"ContainerDied","Data":"0fd944e0d90dd2d6c89f9ef2a20c073169a61b490982190dcc56ca2c9a20f739"} Oct 10 08:09:31 crc kubenswrapper[4822]: I1010 08:09:31.435296 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mbhwj" event={"ID":"b93d3b39-e873-4822-98ab-c02d72ffc7a1","Type":"ContainerStarted","Data":"0c97ac8d7f38ba78cd5911492d94bad712aec49e360d5a494742cf42ede1cd87"} Oct 10 08:09:32 crc kubenswrapper[4822]: I1010 08:09:32.948880 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:32 crc kubenswrapper[4822]: I1010 08:09:32.963589 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8cmb\" (UniqueName: \"kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb\") pod \"b93d3b39-e873-4822-98ab-c02d72ffc7a1\" (UID: \"b93d3b39-e873-4822-98ab-c02d72ffc7a1\") " Oct 10 08:09:32 crc kubenswrapper[4822]: I1010 08:09:32.976489 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb" (OuterVolumeSpecName: "kube-api-access-p8cmb") pod "b93d3b39-e873-4822-98ab-c02d72ffc7a1" (UID: "b93d3b39-e873-4822-98ab-c02d72ffc7a1"). InnerVolumeSpecName "kube-api-access-p8cmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:33 crc kubenswrapper[4822]: I1010 08:09:33.067349 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8cmb\" (UniqueName: \"kubernetes.io/projected/b93d3b39-e873-4822-98ab-c02d72ffc7a1-kube-api-access-p8cmb\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:33 crc kubenswrapper[4822]: I1010 08:09:33.465042 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mbhwj" event={"ID":"b93d3b39-e873-4822-98ab-c02d72ffc7a1","Type":"ContainerDied","Data":"0c97ac8d7f38ba78cd5911492d94bad712aec49e360d5a494742cf42ede1cd87"} Oct 10 08:09:33 crc kubenswrapper[4822]: I1010 08:09:33.465085 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c97ac8d7f38ba78cd5911492d94bad712aec49e360d5a494742cf42ede1cd87" Oct 10 08:09:33 crc kubenswrapper[4822]: I1010 08:09:33.465144 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mbhwj" Oct 10 08:09:33 crc kubenswrapper[4822]: E1010 08:09:33.579751 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb93d3b39_e873_4822_98ab_c02d72ffc7a1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb93d3b39_e873_4822_98ab_c02d72ffc7a1.slice/crio-0c97ac8d7f38ba78cd5911492d94bad712aec49e360d5a494742cf42ede1cd87\": RecentStats: unable to find data in memory cache]" Oct 10 08:09:34 crc kubenswrapper[4822]: I1010 08:09:34.045595 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-pjxfs"] Oct 10 08:09:34 crc kubenswrapper[4822]: I1010 08:09:34.056514 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-pjxfs"] Oct 10 08:09:35 crc kubenswrapper[4822]: I1010 08:09:35.670271 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1389c824-6a9a-4194-a74a-6b85d381a3df" path="/var/lib/kubelet/pods/1389c824-6a9a-4194-a74a-6b85d381a3df/volumes" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.213408 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-520f-account-create-lmc9m"] Oct 10 08:09:40 crc kubenswrapper[4822]: E1010 08:09:40.214472 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93d3b39-e873-4822-98ab-c02d72ffc7a1" containerName="mariadb-database-create" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.214489 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93d3b39-e873-4822-98ab-c02d72ffc7a1" containerName="mariadb-database-create" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.214772 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93d3b39-e873-4822-98ab-c02d72ffc7a1" containerName="mariadb-database-create" Oct 10 
08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.215742 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.220754 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.227996 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-520f-account-create-lmc9m"] Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.272123 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdvw\" (UniqueName: \"kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw\") pod \"aodh-520f-account-create-lmc9m\" (UID: \"30dfa431-47a6-485e-b4df-db5ff05df8e4\") " pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.375769 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdvw\" (UniqueName: \"kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw\") pod \"aodh-520f-account-create-lmc9m\" (UID: \"30dfa431-47a6-485e-b4df-db5ff05df8e4\") " pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.401769 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdvw\" (UniqueName: \"kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw\") pod \"aodh-520f-account-create-lmc9m\" (UID: \"30dfa431-47a6-485e-b4df-db5ff05df8e4\") " pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:40 crc kubenswrapper[4822]: I1010 08:09:40.552601 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:41 crc kubenswrapper[4822]: I1010 08:09:41.025685 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-520f-account-create-lmc9m"] Oct 10 08:09:41 crc kubenswrapper[4822]: I1010 08:09:41.568587 4822 generic.go:334] "Generic (PLEG): container finished" podID="30dfa431-47a6-485e-b4df-db5ff05df8e4" containerID="fa194e3858649ca2a048f6f1e4669480bc87e2b381ea8f2927dbbd133b4a7fe4" exitCode=0 Oct 10 08:09:41 crc kubenswrapper[4822]: I1010 08:09:41.568664 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-520f-account-create-lmc9m" event={"ID":"30dfa431-47a6-485e-b4df-db5ff05df8e4","Type":"ContainerDied","Data":"fa194e3858649ca2a048f6f1e4669480bc87e2b381ea8f2927dbbd133b4a7fe4"} Oct 10 08:09:41 crc kubenswrapper[4822]: I1010 08:09:41.568707 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-520f-account-create-lmc9m" event={"ID":"30dfa431-47a6-485e-b4df-db5ff05df8e4","Type":"ContainerStarted","Data":"45bfa57b6dee5cd0ed799921e70f986386a9a39c21c852c47377e01009143d59"} Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.006787 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.038433 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtdvw\" (UniqueName: \"kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw\") pod \"30dfa431-47a6-485e-b4df-db5ff05df8e4\" (UID: \"30dfa431-47a6-485e-b4df-db5ff05df8e4\") " Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.045470 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw" (OuterVolumeSpecName: "kube-api-access-rtdvw") pod "30dfa431-47a6-485e-b4df-db5ff05df8e4" (UID: "30dfa431-47a6-485e-b4df-db5ff05df8e4"). InnerVolumeSpecName "kube-api-access-rtdvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.141512 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtdvw\" (UniqueName: \"kubernetes.io/projected/30dfa431-47a6-485e-b4df-db5ff05df8e4-kube-api-access-rtdvw\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.590167 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-520f-account-create-lmc9m" event={"ID":"30dfa431-47a6-485e-b4df-db5ff05df8e4","Type":"ContainerDied","Data":"45bfa57b6dee5cd0ed799921e70f986386a9a39c21c852c47377e01009143d59"} Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.590246 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-520f-account-create-lmc9m" Oct 10 08:09:43 crc kubenswrapper[4822]: I1010 08:09:43.590256 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45bfa57b6dee5cd0ed799921e70f986386a9a39c21c852c47377e01009143d59" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.648536 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-zvqf5"] Oct 10 08:09:45 crc kubenswrapper[4822]: E1010 08:09:45.651206 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30dfa431-47a6-485e-b4df-db5ff05df8e4" containerName="mariadb-account-create" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.651303 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="30dfa431-47a6-485e-b4df-db5ff05df8e4" containerName="mariadb-account-create" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.651657 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="30dfa431-47a6-485e-b4df-db5ff05df8e4" containerName="mariadb-account-create" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.652898 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.658301 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.658527 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-q5sjr" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.658621 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.685688 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zvqf5"] Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.699292 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.699361 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw7j9\" (UniqueName: \"kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.699407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.699494 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.801601 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.801790 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.801830 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7j9\" (UniqueName: \"kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.801863 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.807663 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts\") pod 
\"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.807699 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.816785 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.822475 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7j9\" (UniqueName: \"kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9\") pod \"aodh-db-sync-zvqf5\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:45 crc kubenswrapper[4822]: I1010 08:09:45.995838 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:46 crc kubenswrapper[4822]: I1010 08:09:46.524168 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zvqf5"] Oct 10 08:09:46 crc kubenswrapper[4822]: W1010 08:09:46.527900 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c4a48ef_7c7b_460b_b968_8bdb160b3fa0.slice/crio-32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e WatchSource:0}: Error finding container 32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e: Status 404 returned error can't find the container with id 32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e Oct 10 08:09:46 crc kubenswrapper[4822]: I1010 08:09:46.634009 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zvqf5" event={"ID":"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0","Type":"ContainerStarted","Data":"32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e"} Oct 10 08:09:49 crc kubenswrapper[4822]: I1010 08:09:49.800979 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 08:09:51 crc kubenswrapper[4822]: I1010 08:09:51.685539 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zvqf5" event={"ID":"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0","Type":"ContainerStarted","Data":"8385f783def928f15d69fcc41b02fda83ad97d1a3db75bba1e05365a9f30141e"} Oct 10 08:09:51 crc kubenswrapper[4822]: I1010 08:09:51.725787 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-zvqf5" podStartSLOduration=2.697568707 podStartE2EDuration="6.725762673s" podCreationTimestamp="2025-10-10 08:09:45 +0000 UTC" firstStartedPulling="2025-10-10 08:09:46.539275944 +0000 UTC m=+6333.634434150" lastFinishedPulling="2025-10-10 08:09:50.56746992 +0000 UTC m=+6337.662628116" 
observedRunningTime="2025-10-10 08:09:51.721445198 +0000 UTC m=+6338.816603434" watchObservedRunningTime="2025-10-10 08:09:51.725762673 +0000 UTC m=+6338.820920879" Oct 10 08:09:53 crc kubenswrapper[4822]: I1010 08:09:53.712729 4822 generic.go:334] "Generic (PLEG): container finished" podID="1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" containerID="8385f783def928f15d69fcc41b02fda83ad97d1a3db75bba1e05365a9f30141e" exitCode=0 Oct 10 08:09:53 crc kubenswrapper[4822]: I1010 08:09:53.712873 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zvqf5" event={"ID":"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0","Type":"ContainerDied","Data":"8385f783def928f15d69fcc41b02fda83ad97d1a3db75bba1e05365a9f30141e"} Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.186380 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.314012 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw7j9\" (UniqueName: \"kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9\") pod \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.314076 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data\") pod \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.314154 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle\") pod \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " Oct 10 08:09:55 crc 
kubenswrapper[4822]: I1010 08:09:55.314213 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts\") pod \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\" (UID: \"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0\") " Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.322148 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts" (OuterVolumeSpecName: "scripts") pod "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" (UID: "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.322361 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9" (OuterVolumeSpecName: "kube-api-access-nw7j9") pod "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" (UID: "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0"). InnerVolumeSpecName "kube-api-access-nw7j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.343765 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data" (OuterVolumeSpecName: "config-data") pod "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" (UID: "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.345574 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" (UID: "1c4a48ef-7c7b-460b-b968-8bdb160b3fa0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.417222 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw7j9\" (UniqueName: \"kubernetes.io/projected/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-kube-api-access-nw7j9\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.417266 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.417280 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.417291 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.766408 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zvqf5" event={"ID":"1c4a48ef-7c7b-460b-b968-8bdb160b3fa0","Type":"ContainerDied","Data":"32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e"} Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.766485 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zvqf5" Oct 10 08:09:55 crc kubenswrapper[4822]: I1010 08:09:55.768284 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32cefc1cf5a417e1744bfac047bbf828ca39af9faebf038310437e7bc1d86a2e" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.101230 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 10 08:10:00 crc kubenswrapper[4822]: E1010 08:10:00.102324 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" containerName="aodh-db-sync" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.102345 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" containerName="aodh-db-sync" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.102627 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" containerName="aodh-db-sync" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.105143 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.109552 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.109617 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.109552 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-q5sjr" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.118588 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.153157 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-scripts\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.153685 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.153757 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglbl\" (UniqueName: \"kubernetes.io/projected/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-kube-api-access-bglbl\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.153844 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-config-data\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.262768 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.262885 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglbl\" (UniqueName: \"kubernetes.io/projected/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-kube-api-access-bglbl\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.262939 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-config-data\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.263053 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-scripts\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.296738 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglbl\" (UniqueName: \"kubernetes.io/projected/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-kube-api-access-bglbl\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.298468 4822 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-scripts\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.299870 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-config-data\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.315251 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67\") " pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.436779 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:10:00 crc kubenswrapper[4822]: I1010 08:10:00.931273 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:10:01 crc kubenswrapper[4822]: I1010 08:10:01.339110 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:10:01 crc kubenswrapper[4822]: I1010 08:10:01.339423 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:10:01 crc kubenswrapper[4822]: I1010 08:10:01.830449 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67","Type":"ContainerStarted","Data":"35072610b91effb7bbf03478f385dbdbd550d728726b4f495bac40c7c9b13812"} Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.820741 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.821312 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-central-agent" containerID="cri-o://cb1be96061365564ddfd3c85220ddea0c2af34de306695d165f80bc5602ecad2" gracePeriod=30 Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.821427 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-notification-agent" 
containerID="cri-o://c3dabefd51f769325697f6e46523f55ca9d566c322a792789f1346ca8fac8722" gracePeriod=30 Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.821441 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="sg-core" containerID="cri-o://c704b44cc482b85001b51ec1105f06bc2c659e22cd961082ea9815d688af417c" gracePeriod=30 Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.821592 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="proxy-httpd" containerID="cri-o://092089dfe28a7f187b3e030e588cc348b13b3152be1b1d3b88bf1206fd1228c6" gracePeriod=30 Oct 10 08:10:02 crc kubenswrapper[4822]: I1010 08:10:02.845471 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67","Type":"ContainerStarted","Data":"4c5daf82a8360e469b70f13416dfe814532ce50b400a2ea275d0b0594f97a8b7"} Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863370 4822 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerID="092089dfe28a7f187b3e030e588cc348b13b3152be1b1d3b88bf1206fd1228c6" exitCode=0 Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863641 4822 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerID="c704b44cc482b85001b51ec1105f06bc2c659e22cd961082ea9815d688af417c" exitCode=2 Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863651 4822 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerID="cb1be96061365564ddfd3c85220ddea0c2af34de306695d165f80bc5602ecad2" exitCode=0 Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863448 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerDied","Data":"092089dfe28a7f187b3e030e588cc348b13b3152be1b1d3b88bf1206fd1228c6"} Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863687 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerDied","Data":"c704b44cc482b85001b51ec1105f06bc2c659e22cd961082ea9815d688af417c"} Oct 10 08:10:03 crc kubenswrapper[4822]: I1010 08:10:03.863701 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerDied","Data":"cb1be96061365564ddfd3c85220ddea0c2af34de306695d165f80bc5602ecad2"} Oct 10 08:10:04 crc kubenswrapper[4822]: I1010 08:10:04.874688 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67","Type":"ContainerStarted","Data":"f19b2393056028f81647620acedb4c8034a06bf0562136ad80566ab66e8f86fa"} Oct 10 08:10:04 crc kubenswrapper[4822]: I1010 08:10:04.879034 4822 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerID="c3dabefd51f769325697f6e46523f55ca9d566c322a792789f1346ca8fac8722" exitCode=0 Oct 10 08:10:04 crc kubenswrapper[4822]: I1010 08:10:04.879075 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerDied","Data":"c3dabefd51f769325697f6e46523f55ca9d566c322a792789f1346ca8fac8722"} Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.092810 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.196444 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.196532 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.196826 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.196919 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.196972 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.197576 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.197589 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.197759 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd4wq\" (UniqueName: \"kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq\") pod \"7ef5a824-bcdd-4657-b925-2c5c328fd483\" (UID: \"7ef5a824-bcdd-4657-b925-2c5c328fd483\") " Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.198124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.198840 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.198860 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ef5a824-bcdd-4657-b925-2c5c328fd483-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.209456 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq" (OuterVolumeSpecName: "kube-api-access-wd4wq") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "kube-api-access-wd4wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.222562 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts" (OuterVolumeSpecName: "scripts") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.298151 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.302502 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd4wq\" (UniqueName: \"kubernetes.io/projected/7ef5a824-bcdd-4657-b925-2c5c328fd483-kube-api-access-wd4wq\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.302531 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.302552 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.325197 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.380912 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data" (OuterVolumeSpecName: "config-data") pod "7ef5a824-bcdd-4657-b925-2c5c328fd483" (UID: "7ef5a824-bcdd-4657-b925-2c5c328fd483"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.404422 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.404455 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a824-bcdd-4657-b925-2c5c328fd483-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.907508 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ef5a824-bcdd-4657-b925-2c5c328fd483","Type":"ContainerDied","Data":"f2975309681e3adbd169e87e8069c0fc21cbdc617cae9716363016b4e57a075a"} Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.908393 4822 scope.go:117] "RemoveContainer" containerID="092089dfe28a7f187b3e030e588cc348b13b3152be1b1d3b88bf1206fd1228c6" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.907599 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.945124 4822 scope.go:117] "RemoveContainer" containerID="c704b44cc482b85001b51ec1105f06bc2c659e22cd961082ea9815d688af417c" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.950641 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.962850 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.973873 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:05 crc kubenswrapper[4822]: E1010 08:10:05.974300 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-central-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974319 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-central-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: E1010 08:10:05.974333 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="sg-core" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974339 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="sg-core" Oct 10 08:10:05 crc kubenswrapper[4822]: E1010 08:10:05.974368 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="proxy-httpd" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974375 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="proxy-httpd" Oct 10 08:10:05 crc kubenswrapper[4822]: E1010 08:10:05.974399 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-notification-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974405 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-notification-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974597 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-notification-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974614 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="proxy-httpd" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974626 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="ceilometer-central-agent" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.974644 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" containerName="sg-core" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.978197 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.982969 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.983085 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:10:05 crc kubenswrapper[4822]: I1010 08:10:05.988132 4822 scope.go:117] "RemoveContainer" containerID="c3dabefd51f769325697f6e46523f55ca9d566c322a792789f1346ca8fac8722" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:05.990559 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.088785 4822 scope.go:117] "RemoveContainer" containerID="cb1be96061365564ddfd3c85220ddea0c2af34de306695d165f80bc5602ecad2" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.122672 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.122713 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.122735 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " 
pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.122758 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcln5\" (UniqueName: \"kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.123055 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.123173 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.123609 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.225741 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.225920 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.225971 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.225994 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.226012 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.226031 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcln5\" (UniqueName: \"kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.226075 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 
08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.226476 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.226942 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.234283 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.241956 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.243617 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.255581 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcln5\" (UniqueName: \"kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5\") pod \"ceilometer-0\" (UID: 
\"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.260524 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.377572 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.863956 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:06 crc kubenswrapper[4822]: W1010 08:10:06.870999 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc2ca913_7c75_44f2_968d_f3aa14024b37.slice/crio-28cfaa77bc70330a35e61755e1737ebf7404c031d96b8e431b5d43667f1bf52c WatchSource:0}: Error finding container 28cfaa77bc70330a35e61755e1737ebf7404c031d96b8e431b5d43667f1bf52c: Status 404 returned error can't find the container with id 28cfaa77bc70330a35e61755e1737ebf7404c031d96b8e431b5d43667f1bf52c Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.931932 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67","Type":"ContainerStarted","Data":"2281d89f74c73d0ecccfa058d6c1fefcd59656f2855194d8f91ebd7dc56ff982"} Oct 10 08:10:06 crc kubenswrapper[4822]: I1010 08:10:06.942959 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerStarted","Data":"28cfaa77bc70330a35e61755e1737ebf7404c031d96b8e431b5d43667f1bf52c"} Oct 10 08:10:07 crc kubenswrapper[4822]: I1010 08:10:07.697288 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7ef5a824-bcdd-4657-b925-2c5c328fd483" path="/var/lib/kubelet/pods/7ef5a824-bcdd-4657-b925-2c5c328fd483/volumes" Oct 10 08:10:08 crc kubenswrapper[4822]: I1010 08:10:08.970403 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67","Type":"ContainerStarted","Data":"b76a0fd777442d873cd4b566975407ff9b1c8f845953fb26a44c3ebf07e31aba"} Oct 10 08:10:08 crc kubenswrapper[4822]: I1010 08:10:08.973564 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerStarted","Data":"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f"} Oct 10 08:10:08 crc kubenswrapper[4822]: I1010 08:10:08.973598 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerStarted","Data":"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf"} Oct 10 08:10:08 crc kubenswrapper[4822]: I1010 08:10:08.994250 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.058636963 podStartE2EDuration="8.994230816s" podCreationTimestamp="2025-10-10 08:10:00 +0000 UTC" firstStartedPulling="2025-10-10 08:10:00.942795215 +0000 UTC m=+6348.037953421" lastFinishedPulling="2025-10-10 08:10:07.878389078 +0000 UTC m=+6354.973547274" observedRunningTime="2025-10-10 08:10:08.990649683 +0000 UTC m=+6356.085807909" watchObservedRunningTime="2025-10-10 08:10:08.994230816 +0000 UTC m=+6356.089389012" Oct 10 08:10:09 crc kubenswrapper[4822]: I1010 08:10:09.989676 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerStarted","Data":"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d"} Oct 10 08:10:16 crc kubenswrapper[4822]: I1010 08:10:16.057612 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerStarted","Data":"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33"} Oct 10 08:10:16 crc kubenswrapper[4822]: I1010 08:10:16.058387 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:10:16 crc kubenswrapper[4822]: I1010 08:10:16.085838 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.841939098 podStartE2EDuration="11.085821985s" podCreationTimestamp="2025-10-10 08:10:05 +0000 UTC" firstStartedPulling="2025-10-10 08:10:06.874533309 +0000 UTC m=+6353.969691505" lastFinishedPulling="2025-10-10 08:10:15.118416196 +0000 UTC m=+6362.213574392" observedRunningTime="2025-10-10 08:10:16.082274813 +0000 UTC m=+6363.177433019" watchObservedRunningTime="2025-10-10 08:10:16.085821985 +0000 UTC m=+6363.180980191" Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.176590 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-9fn4l"] Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.179106 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.193109 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-9fn4l"] Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.238128 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5mz\" (UniqueName: \"kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz\") pod \"manila-db-create-9fn4l\" (UID: \"130de137-8831-4c2b-9188-aea0f9950b9e\") " pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.341478 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5mz\" (UniqueName: \"kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz\") pod \"manila-db-create-9fn4l\" (UID: \"130de137-8831-4c2b-9188-aea0f9950b9e\") " pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.391379 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5mz\" (UniqueName: \"kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz\") pod \"manila-db-create-9fn4l\" (UID: \"130de137-8831-4c2b-9188-aea0f9950b9e\") " pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:20 crc kubenswrapper[4822]: I1010 08:10:20.504186 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:21 crc kubenswrapper[4822]: I1010 08:10:21.054071 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-9fn4l"] Oct 10 08:10:21 crc kubenswrapper[4822]: I1010 08:10:21.104995 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-9fn4l" event={"ID":"130de137-8831-4c2b-9188-aea0f9950b9e","Type":"ContainerStarted","Data":"f4e8a5154e8ba45a329dd8314acee4a635e1652f47edf4bad9e1ac2e2536e3cb"} Oct 10 08:10:22 crc kubenswrapper[4822]: I1010 08:10:22.122968 4822 generic.go:334] "Generic (PLEG): container finished" podID="130de137-8831-4c2b-9188-aea0f9950b9e" containerID="7c51ae10045f0c1a227389a6451799108c35c721779dce89c46b50bef8a7a4d7" exitCode=0 Oct 10 08:10:22 crc kubenswrapper[4822]: I1010 08:10:22.123135 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-9fn4l" event={"ID":"130de137-8831-4c2b-9188-aea0f9950b9e","Type":"ContainerDied","Data":"7c51ae10045f0c1a227389a6451799108c35c721779dce89c46b50bef8a7a4d7"} Oct 10 08:10:23 crc kubenswrapper[4822]: I1010 08:10:23.529855 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:23 crc kubenswrapper[4822]: I1010 08:10:23.614375 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5mz\" (UniqueName: \"kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz\") pod \"130de137-8831-4c2b-9188-aea0f9950b9e\" (UID: \"130de137-8831-4c2b-9188-aea0f9950b9e\") " Oct 10 08:10:23 crc kubenswrapper[4822]: I1010 08:10:23.619606 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz" (OuterVolumeSpecName: "kube-api-access-kv5mz") pod "130de137-8831-4c2b-9188-aea0f9950b9e" (UID: "130de137-8831-4c2b-9188-aea0f9950b9e"). InnerVolumeSpecName "kube-api-access-kv5mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:23 crc kubenswrapper[4822]: I1010 08:10:23.718210 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5mz\" (UniqueName: \"kubernetes.io/projected/130de137-8831-4c2b-9188-aea0f9950b9e-kube-api-access-kv5mz\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:24 crc kubenswrapper[4822]: I1010 08:10:24.146692 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-9fn4l" event={"ID":"130de137-8831-4c2b-9188-aea0f9950b9e","Type":"ContainerDied","Data":"f4e8a5154e8ba45a329dd8314acee4a635e1652f47edf4bad9e1ac2e2536e3cb"} Oct 10 08:10:24 crc kubenswrapper[4822]: I1010 08:10:24.147021 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e8a5154e8ba45a329dd8314acee4a635e1652f47edf4bad9e1ac2e2536e3cb" Oct 10 08:10:24 crc kubenswrapper[4822]: I1010 08:10:24.146787 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-9fn4l" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.433861 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-730f-account-create-2llzq"] Oct 10 08:10:30 crc kubenswrapper[4822]: E1010 08:10:30.434711 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130de137-8831-4c2b-9188-aea0f9950b9e" containerName="mariadb-database-create" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.434727 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="130de137-8831-4c2b-9188-aea0f9950b9e" containerName="mariadb-database-create" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.435074 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="130de137-8831-4c2b-9188-aea0f9950b9e" containerName="mariadb-database-create" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.435997 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.439512 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.448094 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-730f-account-create-2llzq"] Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.511103 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtts\" (UniqueName: \"kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts\") pod \"manila-730f-account-create-2llzq\" (UID: \"ada20cfb-1fd7-4710-bcd3-f105fed432fa\") " pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.613892 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtts\" (UniqueName: 
\"kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts\") pod \"manila-730f-account-create-2llzq\" (UID: \"ada20cfb-1fd7-4710-bcd3-f105fed432fa\") " pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.634342 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtts\" (UniqueName: \"kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts\") pod \"manila-730f-account-create-2llzq\" (UID: \"ada20cfb-1fd7-4710-bcd3-f105fed432fa\") " pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:30 crc kubenswrapper[4822]: I1010 08:10:30.763030 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:31 crc kubenswrapper[4822]: W1010 08:10:31.228760 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada20cfb_1fd7_4710_bcd3_f105fed432fa.slice/crio-5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae WatchSource:0}: Error finding container 5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae: Status 404 returned error can't find the container with id 5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae Oct 10 08:10:31 crc kubenswrapper[4822]: I1010 08:10:31.229949 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-730f-account-create-2llzq"] Oct 10 08:10:31 crc kubenswrapper[4822]: I1010 08:10:31.336324 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:10:31 crc kubenswrapper[4822]: I1010 08:10:31.336395 4822 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:10:31 crc kubenswrapper[4822]: I1010 08:10:31.544526 4822 scope.go:117] "RemoveContainer" containerID="14a75ec7486424f6d351c35963558619cc274df17be776d2456eddb0c40729f5" Oct 10 08:10:32 crc kubenswrapper[4822]: I1010 08:10:32.239014 4822 generic.go:334] "Generic (PLEG): container finished" podID="ada20cfb-1fd7-4710-bcd3-f105fed432fa" containerID="759e6df25a0e08c62e8ce28baa854098a8c3859f8a036b91e2380376fc92de24" exitCode=0 Oct 10 08:10:32 crc kubenswrapper[4822]: I1010 08:10:32.239088 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-730f-account-create-2llzq" event={"ID":"ada20cfb-1fd7-4710-bcd3-f105fed432fa","Type":"ContainerDied","Data":"759e6df25a0e08c62e8ce28baa854098a8c3859f8a036b91e2380376fc92de24"} Oct 10 08:10:32 crc kubenswrapper[4822]: I1010 08:10:32.239155 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-730f-account-create-2llzq" event={"ID":"ada20cfb-1fd7-4710-bcd3-f105fed432fa","Type":"ContainerStarted","Data":"5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae"} Oct 10 08:10:33 crc kubenswrapper[4822]: I1010 08:10:33.667006 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:33 crc kubenswrapper[4822]: I1010 08:10:33.794086 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wtts\" (UniqueName: \"kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts\") pod \"ada20cfb-1fd7-4710-bcd3-f105fed432fa\" (UID: \"ada20cfb-1fd7-4710-bcd3-f105fed432fa\") " Oct 10 08:10:33 crc kubenswrapper[4822]: I1010 08:10:33.804406 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts" (OuterVolumeSpecName: "kube-api-access-7wtts") pod "ada20cfb-1fd7-4710-bcd3-f105fed432fa" (UID: "ada20cfb-1fd7-4710-bcd3-f105fed432fa"). InnerVolumeSpecName "kube-api-access-7wtts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:33 crc kubenswrapper[4822]: I1010 08:10:33.898466 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wtts\" (UniqueName: \"kubernetes.io/projected/ada20cfb-1fd7-4710-bcd3-f105fed432fa-kube-api-access-7wtts\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:34 crc kubenswrapper[4822]: I1010 08:10:34.265778 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-730f-account-create-2llzq" event={"ID":"ada20cfb-1fd7-4710-bcd3-f105fed432fa","Type":"ContainerDied","Data":"5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae"} Oct 10 08:10:34 crc kubenswrapper[4822]: I1010 08:10:34.265871 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c5d5d4aa4e3778b7fca14706216e2dc3747af4185352768f39948ab63d93aae" Oct 10 08:10:34 crc kubenswrapper[4822]: I1010 08:10:34.265950 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-730f-account-create-2llzq" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.773389 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-6jcn9"] Oct 10 08:10:35 crc kubenswrapper[4822]: E1010 08:10:35.774236 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada20cfb-1fd7-4710-bcd3-f105fed432fa" containerName="mariadb-account-create" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.774255 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada20cfb-1fd7-4710-bcd3-f105fed432fa" containerName="mariadb-account-create" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.774566 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada20cfb-1fd7-4710-bcd3-f105fed432fa" containerName="mariadb-account-create" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.775567 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.777490 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-z67h7" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.777583 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.788954 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-6jcn9"] Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.945141 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psvst\" (UniqueName: \"kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.945208 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.945364 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:35 crc kubenswrapper[4822]: I1010 08:10:35.945477 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.047127 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.047252 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.047308 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-psvst\" (UniqueName: \"kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.047330 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.054280 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.055348 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.064102 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data\") pod \"manila-db-sync-6jcn9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.070590 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psvst\" (UniqueName: \"kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst\") pod \"manila-db-sync-6jcn9\" 
(UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.095177 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.389348 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 08:10:36 crc kubenswrapper[4822]: I1010 08:10:36.991338 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-6jcn9"] Oct 10 08:10:37 crc kubenswrapper[4822]: I1010 08:10:37.317304 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-6jcn9" event={"ID":"c40b9ac3-db63-4902-8b68-ba81adf704f9","Type":"ContainerStarted","Data":"2a10692bea3fff3d45e9cff86a732113193b835933b4cb195baa305a70c1657e"} Oct 10 08:10:41 crc kubenswrapper[4822]: I1010 08:10:41.362834 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-6jcn9" event={"ID":"c40b9ac3-db63-4902-8b68-ba81adf704f9","Type":"ContainerStarted","Data":"a7455e852761aac14f1f9b4a302a731e4c0c6848ca81da2758cefa26b7677809"} Oct 10 08:10:41 crc kubenswrapper[4822]: I1010 08:10:41.381517 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-6jcn9" podStartSLOduration=2.49397724 podStartE2EDuration="6.381494642s" podCreationTimestamp="2025-10-10 08:10:35 +0000 UTC" firstStartedPulling="2025-10-10 08:10:37.000247657 +0000 UTC m=+6384.095405863" lastFinishedPulling="2025-10-10 08:10:40.887765059 +0000 UTC m=+6387.982923265" observedRunningTime="2025-10-10 08:10:41.375963883 +0000 UTC m=+6388.471122089" watchObservedRunningTime="2025-10-10 08:10:41.381494642 +0000 UTC m=+6388.476652838" Oct 10 08:10:43 crc kubenswrapper[4822]: I1010 08:10:43.384634 4822 generic.go:334] "Generic (PLEG): container finished" podID="c40b9ac3-db63-4902-8b68-ba81adf704f9" 
containerID="a7455e852761aac14f1f9b4a302a731e4c0c6848ca81da2758cefa26b7677809" exitCode=0 Oct 10 08:10:43 crc kubenswrapper[4822]: I1010 08:10:43.384698 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-6jcn9" event={"ID":"c40b9ac3-db63-4902-8b68-ba81adf704f9","Type":"ContainerDied","Data":"a7455e852761aac14f1f9b4a302a731e4c0c6848ca81da2758cefa26b7677809"} Oct 10 08:10:44 crc kubenswrapper[4822]: I1010 08:10:44.897310 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.073628 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psvst\" (UniqueName: \"kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst\") pod \"c40b9ac3-db63-4902-8b68-ba81adf704f9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.074255 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data\") pod \"c40b9ac3-db63-4902-8b68-ba81adf704f9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.074722 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle\") pod \"c40b9ac3-db63-4902-8b68-ba81adf704f9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") " Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.074935 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data\") pod \"c40b9ac3-db63-4902-8b68-ba81adf704f9\" (UID: \"c40b9ac3-db63-4902-8b68-ba81adf704f9\") 
" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.080961 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "c40b9ac3-db63-4902-8b68-ba81adf704f9" (UID: "c40b9ac3-db63-4902-8b68-ba81adf704f9"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.081091 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst" (OuterVolumeSpecName: "kube-api-access-psvst") pod "c40b9ac3-db63-4902-8b68-ba81adf704f9" (UID: "c40b9ac3-db63-4902-8b68-ba81adf704f9"). InnerVolumeSpecName "kube-api-access-psvst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.087377 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data" (OuterVolumeSpecName: "config-data") pod "c40b9ac3-db63-4902-8b68-ba81adf704f9" (UID: "c40b9ac3-db63-4902-8b68-ba81adf704f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.113037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c40b9ac3-db63-4902-8b68-ba81adf704f9" (UID: "c40b9ac3-db63-4902-8b68-ba81adf704f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.178569 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.178605 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.178617 4822 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/c40b9ac3-db63-4902-8b68-ba81adf704f9-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.178628 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psvst\" (UniqueName: \"kubernetes.io/projected/c40b9ac3-db63-4902-8b68-ba81adf704f9-kube-api-access-psvst\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.405604 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-6jcn9" event={"ID":"c40b9ac3-db63-4902-8b68-ba81adf704f9","Type":"ContainerDied","Data":"2a10692bea3fff3d45e9cff86a732113193b835933b4cb195baa305a70c1657e"} Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.405647 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a10692bea3fff3d45e9cff86a732113193b835933b4cb195baa305a70c1657e" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.405716 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-6jcn9" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.785353 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 10 08:10:45 crc kubenswrapper[4822]: E1010 08:10:45.786212 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40b9ac3-db63-4902-8b68-ba81adf704f9" containerName="manila-db-sync" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.786230 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40b9ac3-db63-4902-8b68-ba81adf704f9" containerName="manila-db-sync" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.786463 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40b9ac3-db63-4902-8b68-ba81adf704f9" containerName="manila-db-sync" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.787741 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.791015 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.804119 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.809767 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-z67h7" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.809975 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.904565 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-ceph\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " 
pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.904613 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.904708 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.904767 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.904945 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.905153 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7q4\" (UniqueName: \"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-kube-api-access-2r7q4\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.905208 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.905293 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.905361 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-scripts\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.909970 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.914083 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.928755 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.948619 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.965256 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:10:45 crc kubenswrapper[4822]: I1010 08:10:45.968379 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.010875 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhk9\" (UniqueName: \"kubernetes.io/projected/5c51bb29-a67c-42a9-8243-6a8281745cc0-kube-api-access-tmhk9\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.010914 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.010957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-scripts\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011106 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011194 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " 
pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011322 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-ceph\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011364 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011521 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011608 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011743 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: 
I1010 08:10:46.011868 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011904 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.004447 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011957 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c51bb29-a67c-42a9-8243-6a8281745cc0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.011993 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-scripts\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " 
pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012276 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012398 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r7q4\" (UniqueName: \"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-kube-api-access-2r7q4\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012458 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012486 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxm82\" (UniqueName: \"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.012611 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc 
kubenswrapper[4822]: I1010 08:10:46.012642 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.024467 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-scripts\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.035091 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.050292 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.052059 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.054854 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r7q4\" (UniqueName: 
\"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-kube-api-access-2r7q4\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.069076 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/694e4feb-bdf7-42c7-b0d1-7e5adb7a0444-ceph\") pod \"manila-share-share1-0\" (UID: \"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444\") " pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.107633 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.110478 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.112574 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114307 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114362 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c51bb29-a67c-42a9-8243-6a8281745cc0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114395 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-scripts\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114466 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxm82\" (UniqueName: \"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114539 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114578 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhk9\" (UniqueName: \"kubernetes.io/projected/5c51bb29-a67c-42a9-8243-6a8281745cc0-kube-api-access-tmhk9\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114599 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114611 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c51bb29-a67c-42a9-8243-6a8281745cc0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114653 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114680 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114750 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.114814 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.115789 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc 
kubenswrapper[4822]: I1010 08:10:46.116933 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.117082 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.117376 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.118401 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.118710 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.119526 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-scripts\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.123564 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c51bb29-a67c-42a9-8243-6a8281745cc0-config-data\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.140101 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.141109 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhk9\" (UniqueName: \"kubernetes.io/projected/5c51bb29-a67c-42a9-8243-6a8281745cc0-kube-api-access-tmhk9\") pod \"manila-scheduler-0\" (UID: \"5c51bb29-a67c-42a9-8243-6a8281745cc0\") " pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.145327 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxm82\" (UniqueName: \"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82\") pod \"dnsmasq-dns-56d9666d69-2mg5m\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.152387 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218447 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10f6deb2-9123-429f-b3a6-7febb7f832e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218595 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-scripts\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218635 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10f6deb2-9123-429f-b3a6-7febb7f832e2-logs\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218686 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data-custom\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218729 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwc5g\" (UniqueName: \"kubernetes.io/projected/10f6deb2-9123-429f-b3a6-7febb7f832e2-kube-api-access-pwc5g\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218839 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.218898 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.243445 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.308456 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.321974 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322053 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322151 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/10f6deb2-9123-429f-b3a6-7febb7f832e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322233 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-scripts\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322252 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10f6deb2-9123-429f-b3a6-7febb7f832e2-logs\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322302 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data-custom\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.322340 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwc5g\" (UniqueName: \"kubernetes.io/projected/10f6deb2-9123-429f-b3a6-7febb7f832e2-kube-api-access-pwc5g\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.323472 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10f6deb2-9123-429f-b3a6-7febb7f832e2-logs\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.324665 4822 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10f6deb2-9123-429f-b3a6-7febb7f832e2-etc-machine-id\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.330818 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data-custom\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.336052 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.340872 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-scripts\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.345692 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f6deb2-9123-429f-b3a6-7febb7f832e2-config-data\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.347193 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwc5g\" (UniqueName: \"kubernetes.io/projected/10f6deb2-9123-429f-b3a6-7febb7f832e2-kube-api-access-pwc5g\") pod \"manila-api-0\" (UID: \"10f6deb2-9123-429f-b3a6-7febb7f832e2\") " 
pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.363260 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.855694 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 10 08:10:46 crc kubenswrapper[4822]: I1010 08:10:46.949983 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:10:46 crc kubenswrapper[4822]: W1010 08:10:46.953912 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf57931_2464_4634_bc1b_94f5bc60fe5a.slice/crio-3e7ccea69303c66d8dc6104a4153e5a1fea639d560be529cecde447868005d87 WatchSource:0}: Error finding container 3e7ccea69303c66d8dc6104a4153e5a1fea639d560be529cecde447868005d87: Status 404 returned error can't find the container with id 3e7ccea69303c66d8dc6104a4153e5a1fea639d560be529cecde447868005d87 Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.100054 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 10 08:10:47 crc kubenswrapper[4822]: W1010 08:10:47.166157 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c51bb29_a67c_42a9_8243_6a8281745cc0.slice/crio-60a66b6b79d1bf9ec81f15f079d844300c33c3fbbdcc0d2e950fe306930f9701 WatchSource:0}: Error finding container 60a66b6b79d1bf9ec81f15f079d844300c33c3fbbdcc0d2e950fe306930f9701: Status 404 returned error can't find the container with id 60a66b6b79d1bf9ec81f15f079d844300c33c3fbbdcc0d2e950fe306930f9701 Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.299335 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 10 08:10:47 crc kubenswrapper[4822]: W1010 08:10:47.346839 4822 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f6deb2_9123_429f_b3a6_7febb7f832e2.slice/crio-71b2831961b7ea0f95b55e9d9e742c04e89c1b725ce6573b5d4c07542f5cfeb0 WatchSource:0}: Error finding container 71b2831961b7ea0f95b55e9d9e742c04e89c1b725ce6573b5d4c07542f5cfeb0: Status 404 returned error can't find the container with id 71b2831961b7ea0f95b55e9d9e742c04e89c1b725ce6573b5d4c07542f5cfeb0 Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.458712 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444","Type":"ContainerStarted","Data":"f49ea2cb388ba124a9402d1fd3efd7e34bfa3dba0e0ba090759d46c926d3e44c"} Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.468043 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"10f6deb2-9123-429f-b3a6-7febb7f832e2","Type":"ContainerStarted","Data":"71b2831961b7ea0f95b55e9d9e742c04e89c1b725ce6573b5d4c07542f5cfeb0"} Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.469361 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" event={"ID":"4cf57931-2464-4634-bc1b-94f5bc60fe5a","Type":"ContainerStarted","Data":"3e7ccea69303c66d8dc6104a4153e5a1fea639d560be529cecde447868005d87"} Oct 10 08:10:47 crc kubenswrapper[4822]: I1010 08:10:47.470301 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5c51bb29-a67c-42a9-8243-6a8281745cc0","Type":"ContainerStarted","Data":"60a66b6b79d1bf9ec81f15f079d844300c33c3fbbdcc0d2e950fe306930f9701"} Oct 10 08:10:48 crc kubenswrapper[4822]: I1010 08:10:48.485610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"10f6deb2-9123-429f-b3a6-7febb7f832e2","Type":"ContainerStarted","Data":"912d04aa3ec8d17c5f46bcc8924b450d7821d52ea63a9d51fc85a9626bf41c08"} Oct 10 08:10:48 crc 
kubenswrapper[4822]: I1010 08:10:48.489237 4822 generic.go:334] "Generic (PLEG): container finished" podID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerID="040423b4f92cba0fd7984724feb76a11a99eefdcf79570dcd95d4868fd190c0d" exitCode=0 Oct 10 08:10:48 crc kubenswrapper[4822]: I1010 08:10:48.489311 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" event={"ID":"4cf57931-2464-4634-bc1b-94f5bc60fe5a","Type":"ContainerDied","Data":"040423b4f92cba0fd7984724feb76a11a99eefdcf79570dcd95d4868fd190c0d"} Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.502900 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" event={"ID":"4cf57931-2464-4634-bc1b-94f5bc60fe5a","Type":"ContainerStarted","Data":"37ab66428e8a85fe2aaad30a96eab82d62dda22e91481fec03503ff653b431b4"} Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.503585 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.506534 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5c51bb29-a67c-42a9-8243-6a8281745cc0","Type":"ContainerStarted","Data":"b332710cc18693f49f7daa3674c860c639a872e16daa5736c1bca9ff596d0187"} Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.506574 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5c51bb29-a67c-42a9-8243-6a8281745cc0","Type":"ContainerStarted","Data":"62a1290ce50787caaccac4ba7a41e7345564c468fe1066ddbed1fa3f1ef7c2e6"} Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.509451 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"10f6deb2-9123-429f-b3a6-7febb7f832e2","Type":"ContainerStarted","Data":"5e6832350f9c4628e4d263fa7021e41b83151707dcad29e556314722397e9e84"} Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 
08:10:49.509659 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.527896 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" podStartSLOduration=4.52787103 podStartE2EDuration="4.52787103s" podCreationTimestamp="2025-10-10 08:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:10:49.524951916 +0000 UTC m=+6396.620110152" watchObservedRunningTime="2025-10-10 08:10:49.52787103 +0000 UTC m=+6396.623029266" Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.567229 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.567206234 podStartE2EDuration="3.567206234s" podCreationTimestamp="2025-10-10 08:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:10:49.555497627 +0000 UTC m=+6396.650655823" watchObservedRunningTime="2025-10-10 08:10:49.567206234 +0000 UTC m=+6396.662364430" Oct 10 08:10:49 crc kubenswrapper[4822]: I1010 08:10:49.585015 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.820996722 podStartE2EDuration="4.584971806s" podCreationTimestamp="2025-10-10 08:10:45 +0000 UTC" firstStartedPulling="2025-10-10 08:10:47.249845108 +0000 UTC m=+6394.345003304" lastFinishedPulling="2025-10-10 08:10:48.013820192 +0000 UTC m=+6395.108978388" observedRunningTime="2025-10-10 08:10:49.578080398 +0000 UTC m=+6396.673238604" watchObservedRunningTime="2025-10-10 08:10:49.584971806 +0000 UTC m=+6396.680130002" Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.244558 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/manila-scheduler-0" Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.310031 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.379044 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.382167 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="dnsmasq-dns" containerID="cri-o://63783dd67353e256011aa2a561c080cd64dc8b4910149f251c16ef327e54280a" gracePeriod=10 Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.626688 4822 generic.go:334] "Generic (PLEG): container finished" podID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerID="63783dd67353e256011aa2a561c080cd64dc8b4910149f251c16ef327e54280a" exitCode=0 Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.627043 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" event={"ID":"7cd06439-9c72-4f40-b9b5-326e11904d84","Type":"ContainerDied","Data":"63783dd67353e256011aa2a561c080cd64dc8b4910149f251c16ef327e54280a"} Oct 10 08:10:56 crc kubenswrapper[4822]: I1010 08:10:56.629880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444","Type":"ContainerStarted","Data":"0218fdd0749c19977f5734a9788a90424056d5d8e8fffbf006b88372032da2ed"} Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.062216 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.122185 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config\") pod \"7cd06439-9c72-4f40-b9b5-326e11904d84\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.122291 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb\") pod \"7cd06439-9c72-4f40-b9b5-326e11904d84\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.122315 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb\") pod \"7cd06439-9c72-4f40-b9b5-326e11904d84\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.122566 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2g44\" (UniqueName: \"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44\") pod \"7cd06439-9c72-4f40-b9b5-326e11904d84\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.122720 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc\") pod \"7cd06439-9c72-4f40-b9b5-326e11904d84\" (UID: \"7cd06439-9c72-4f40-b9b5-326e11904d84\") " Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.132064 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44" (OuterVolumeSpecName: "kube-api-access-p2g44") pod "7cd06439-9c72-4f40-b9b5-326e11904d84" (UID: "7cd06439-9c72-4f40-b9b5-326e11904d84"). InnerVolumeSpecName "kube-api-access-p2g44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.201033 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cd06439-9c72-4f40-b9b5-326e11904d84" (UID: "7cd06439-9c72-4f40-b9b5-326e11904d84"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.202543 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config" (OuterVolumeSpecName: "config") pod "7cd06439-9c72-4f40-b9b5-326e11904d84" (UID: "7cd06439-9c72-4f40-b9b5-326e11904d84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.213306 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cd06439-9c72-4f40-b9b5-326e11904d84" (UID: "7cd06439-9c72-4f40-b9b5-326e11904d84"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.226642 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2g44\" (UniqueName: \"kubernetes.io/projected/7cd06439-9c72-4f40-b9b5-326e11904d84-kube-api-access-p2g44\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.226693 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.226707 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.226719 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.240110 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cd06439-9c72-4f40-b9b5-326e11904d84" (UID: "7cd06439-9c72-4f40-b9b5-326e11904d84"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.329055 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd06439-9c72-4f40-b9b5-326e11904d84-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.645515 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"694e4feb-bdf7-42c7-b0d1-7e5adb7a0444","Type":"ContainerStarted","Data":"7c98086f4ec6e69f5ddf8b9773ab0d1d775c24a5d5b4faa722e4ace3a9d53096"} Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.648380 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" event={"ID":"7cd06439-9c72-4f40-b9b5-326e11904d84","Type":"ContainerDied","Data":"e58a5c948a868ec63a6a8cce711e876d6a2889f3b4aaacf6b0831b658ae27e0e"} Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.648466 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4f97f5f5-tkv86" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.648491 4822 scope.go:117] "RemoveContainer" containerID="63783dd67353e256011aa2a561c080cd64dc8b4910149f251c16ef327e54280a" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.680965 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.930256163 podStartE2EDuration="12.680940871s" podCreationTimestamp="2025-10-10 08:10:45 +0000 UTC" firstStartedPulling="2025-10-10 08:10:46.858152307 +0000 UTC m=+6393.953310503" lastFinishedPulling="2025-10-10 08:10:55.608836995 +0000 UTC m=+6402.703995211" observedRunningTime="2025-10-10 08:10:57.678081079 +0000 UTC m=+6404.773239295" watchObservedRunningTime="2025-10-10 08:10:57.680940871 +0000 UTC m=+6404.776099067" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.707923 4822 scope.go:117] "RemoveContainer" containerID="660ef52da5c705a0adeee9720ec10ce6491567a54816055e67188d0cf8d57cb0" Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.713403 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 08:10:57 crc kubenswrapper[4822]: I1010 08:10:57.734390 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c4f97f5f5-tkv86"] Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.273418 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.274161 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="sg-core" containerID="cri-o://c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d" gracePeriod=30 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.274162 4822 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="proxy-httpd" containerID="cri-o://1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33" gracePeriod=30 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.274349 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-notification-agent" containerID="cri-o://e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f" gracePeriod=30 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.275937 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-central-agent" containerID="cri-o://74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf" gracePeriod=30 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.667487 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" path="/var/lib/kubelet/pods/7cd06439-9c72-4f40-b9b5-326e11904d84/volumes" Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.677299 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerID="1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33" exitCode=0 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.677337 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerID="c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d" exitCode=2 Oct 10 08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.677361 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerDied","Data":"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33"} Oct 10 
08:10:59 crc kubenswrapper[4822]: I1010 08:10:59.677390 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerDied","Data":"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d"} Oct 10 08:11:00 crc kubenswrapper[4822]: I1010 08:11:00.690324 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerID="74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf" exitCode=0 Oct 10 08:11:00 crc kubenswrapper[4822]: I1010 08:11:00.690652 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerDied","Data":"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf"} Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.337177 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.337268 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.337328 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.338682 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.338828 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" gracePeriod=600 Oct 10 08:11:01 crc kubenswrapper[4822]: E1010 08:11:01.466012 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.700689 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" exitCode=0 Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.700769 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab"} Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.701092 4822 scope.go:117] "RemoveContainer" containerID="7499bf42536c70e0590eda28c91355142cca10772ff9c8c9df99fd82114164a5" Oct 10 08:11:01 crc kubenswrapper[4822]: I1010 08:11:01.701875 4822 
scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:11:01 crc kubenswrapper[4822]: E1010 08:11:01.702122 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.530753 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.589886 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590070 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcln5\" (UniqueName: \"kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590123 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590145 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590246 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590332 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.590391 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd\") pod \"dc2ca913-7c75-44f2-968d-f3aa14024b37\" (UID: \"dc2ca913-7c75-44f2-968d-f3aa14024b37\") " Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.591282 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.592093 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.602743 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5" (OuterVolumeSpecName: "kube-api-access-tcln5") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "kube-api-access-tcln5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.620731 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts" (OuterVolumeSpecName: "scripts") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.680862 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.695152 4822 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.695182 4822 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.695193 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcln5\" (UniqueName: \"kubernetes.io/projected/dc2ca913-7c75-44f2-968d-f3aa14024b37-kube-api-access-tcln5\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.695202 4822 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.695210 4822 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc2ca913-7c75-44f2-968d-f3aa14024b37-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.713957 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.718773 4822 generic.go:334] "Generic (PLEG): container finished" podID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerID="e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f" exitCode=0 Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.719068 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.719086 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerDied","Data":"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f"} Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.720300 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc2ca913-7c75-44f2-968d-f3aa14024b37","Type":"ContainerDied","Data":"28cfaa77bc70330a35e61755e1737ebf7404c031d96b8e431b5d43667f1bf52c"} Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.720405 4822 scope.go:117] "RemoveContainer" containerID="1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.762959 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data" (OuterVolumeSpecName: "config-data") pod "dc2ca913-7c75-44f2-968d-f3aa14024b37" (UID: "dc2ca913-7c75-44f2-968d-f3aa14024b37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.797796 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.797844 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2ca913-7c75-44f2-968d-f3aa14024b37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.820988 4822 scope.go:117] "RemoveContainer" containerID="c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.845621 4822 scope.go:117] "RemoveContainer" containerID="e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.868309 4822 scope.go:117] "RemoveContainer" containerID="74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.899198 4822 scope.go:117] "RemoveContainer" containerID="1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33" Oct 10 08:11:02 crc kubenswrapper[4822]: E1010 08:11:02.899741 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33\": container with ID starting with 1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33 not found: ID does not exist" containerID="1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.899875 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33"} 
err="failed to get container status \"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33\": rpc error: code = NotFound desc = could not find container \"1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33\": container with ID starting with 1ff6e2a538244b4aa9f5a3318a1aa7fe9b7e45c2d23f35b4e53ee498c4f02f33 not found: ID does not exist" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.899987 4822 scope.go:117] "RemoveContainer" containerID="c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d" Oct 10 08:11:02 crc kubenswrapper[4822]: E1010 08:11:02.900323 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d\": container with ID starting with c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d not found: ID does not exist" containerID="c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.900358 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d"} err="failed to get container status \"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d\": rpc error: code = NotFound desc = could not find container \"c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d\": container with ID starting with c1a5187d5a7930a8920eb613387f642174edffe2911f3e1de1a864e37a56956d not found: ID does not exist" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.900382 4822 scope.go:117] "RemoveContainer" containerID="e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f" Oct 10 08:11:02 crc kubenswrapper[4822]: E1010 08:11:02.900732 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f\": container with ID starting with e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f not found: ID does not exist" containerID="e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.900787 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f"} err="failed to get container status \"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f\": rpc error: code = NotFound desc = could not find container \"e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f\": container with ID starting with e5c509719e070871d16d5eba8d952fab47c05acd6f81e7b11e90bd6482ea707f not found: ID does not exist" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.900841 4822 scope.go:117] "RemoveContainer" containerID="74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf" Oct 10 08:11:02 crc kubenswrapper[4822]: E1010 08:11:02.901179 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf\": container with ID starting with 74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf not found: ID does not exist" containerID="74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf" Oct 10 08:11:02 crc kubenswrapper[4822]: I1010 08:11:02.901223 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf"} err="failed to get container status \"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf\": rpc error: code = NotFound desc = could not find container \"74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf\": container with ID 
starting with 74ab12bd0bed401f123c06ab2ac5259c571d9463294345ddc0e73c65c008d2cf not found: ID does not exist" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.053644 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.063103 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083289 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083757 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-central-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083776 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-central-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083799 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="init" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083824 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="init" Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083848 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="sg-core" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083857 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="sg-core" Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083876 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="dnsmasq-dns" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083884 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="dnsmasq-dns" Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083894 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-notification-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083903 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-notification-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: E1010 08:11:03.083915 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="proxy-httpd" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.083921 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="proxy-httpd" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.084165 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="sg-core" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.084182 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd06439-9c72-4f40-b9b5-326e11904d84" containerName="dnsmasq-dns" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.084201 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="proxy-httpd" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.084214 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-notification-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.084237 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" containerName="ceilometer-central-agent" Oct 10 08:11:03 crc kubenswrapper[4822]: 
I1010 08:11:03.090419 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.093080 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.093811 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.110798 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206353 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-log-httpd\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206404 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-config-data\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206433 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206517 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-run-httpd\") pod 
\"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206583 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-scripts\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.206691 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ljpp\" (UniqueName: \"kubernetes.io/projected/50a32b63-179a-49ea-a6b9-badbc0008449-kube-api-access-7ljpp\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309015 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-run-httpd\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309087 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-scripts\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309122 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309173 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ljpp\" (UniqueName: \"kubernetes.io/projected/50a32b63-179a-49ea-a6b9-badbc0008449-kube-api-access-7ljpp\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309233 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-log-httpd\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309252 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-config-data\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309269 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.309534 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-run-httpd\") pod \"ceilometer-0\" (UID: 
\"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.310001 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50a32b63-179a-49ea-a6b9-badbc0008449-log-httpd\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.316598 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.316722 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-config-data\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.317090 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.324151 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50a32b63-179a-49ea-a6b9-badbc0008449-scripts\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.330182 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ljpp\" (UniqueName: 
\"kubernetes.io/projected/50a32b63-179a-49ea-a6b9-badbc0008449-kube-api-access-7ljpp\") pod \"ceilometer-0\" (UID: \"50a32b63-179a-49ea-a6b9-badbc0008449\") " pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.408013 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.691844 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2ca913-7c75-44f2-968d-f3aa14024b37" path="/var/lib/kubelet/pods/dc2ca913-7c75-44f2-968d-f3aa14024b37/volumes" Oct 10 08:11:03 crc kubenswrapper[4822]: W1010 08:11:03.933116 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a32b63_179a_49ea_a6b9_badbc0008449.slice/crio-11f09aac4a0cec6d625d24e887b677c96e77a1f3d01603ae1f880e9a21f26a39 WatchSource:0}: Error finding container 11f09aac4a0cec6d625d24e887b677c96e77a1f3d01603ae1f880e9a21f26a39: Status 404 returned error can't find the container with id 11f09aac4a0cec6d625d24e887b677c96e77a1f3d01603ae1f880e9a21f26a39 Oct 10 08:11:03 crc kubenswrapper[4822]: I1010 08:11:03.936861 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:11:04 crc kubenswrapper[4822]: I1010 08:11:04.743842 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50a32b63-179a-49ea-a6b9-badbc0008449","Type":"ContainerStarted","Data":"11f09aac4a0cec6d625d24e887b677c96e77a1f3d01603ae1f880e9a21f26a39"} Oct 10 08:11:05 crc kubenswrapper[4822]: I1010 08:11:05.754858 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50a32b63-179a-49ea-a6b9-badbc0008449","Type":"ContainerStarted","Data":"c1539a3eb1f558c472f887fb8871f3b7b815c58e246ab1b5ec008f94824b04d0"} Oct 10 08:11:05 crc kubenswrapper[4822]: I1010 08:11:05.755214 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"50a32b63-179a-49ea-a6b9-badbc0008449","Type":"ContainerStarted","Data":"bffdee07f61d288cfb36a27b8dfb447b89b004bb3fce7ca8ff5b6bc24460b0bd"} Oct 10 08:11:06 crc kubenswrapper[4822]: I1010 08:11:06.154188 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 10 08:11:06 crc kubenswrapper[4822]: I1010 08:11:06.770997 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50a32b63-179a-49ea-a6b9-badbc0008449","Type":"ContainerStarted","Data":"45701ce8752def69fd6df8c0545b55e1c159ecda05d50a91c5557704f5dc38bb"} Oct 10 08:11:07 crc kubenswrapper[4822]: I1010 08:11:07.742658 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 10 08:11:07 crc kubenswrapper[4822]: I1010 08:11:07.769251 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 10 08:11:07 crc kubenswrapper[4822]: I1010 08:11:07.808145 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50a32b63-179a-49ea-a6b9-badbc0008449","Type":"ContainerStarted","Data":"3de016ac5eecad255dff020acf543df7fe80c206216e48e1e0b2f3b1a6cbdbde"} Oct 10 08:11:07 crc kubenswrapper[4822]: I1010 08:11:07.808836 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:11:07 crc kubenswrapper[4822]: I1010 08:11:07.852526 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4200685800000001 podStartE2EDuration="4.852503992s" podCreationTimestamp="2025-10-10 08:11:03 +0000 UTC" firstStartedPulling="2025-10-10 08:11:03.936284573 +0000 UTC m=+6411.031442769" lastFinishedPulling="2025-10-10 08:11:07.368719965 +0000 UTC m=+6414.463878181" observedRunningTime="2025-10-10 08:11:07.842760211 +0000 UTC 
m=+6414.937918417" watchObservedRunningTime="2025-10-10 08:11:07.852503992 +0000 UTC m=+6414.947662188" Oct 10 08:11:08 crc kubenswrapper[4822]: I1010 08:11:08.011182 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 10 08:11:16 crc kubenswrapper[4822]: I1010 08:11:16.650378 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:11:16 crc kubenswrapper[4822]: E1010 08:11:16.651238 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:29 crc kubenswrapper[4822]: I1010 08:11:29.650927 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:11:29 crc kubenswrapper[4822]: E1010 08:11:29.651843 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:33 crc kubenswrapper[4822]: I1010 08:11:33.413411 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 08:11:40 crc kubenswrapper[4822]: I1010 08:11:40.651637 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:11:40 crc 
kubenswrapper[4822]: E1010 08:11:40.653096 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.556604 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.559940 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.564133 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.579504 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.651125 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:11:55 crc kubenswrapper[4822]: E1010 08:11:55.651494 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.703357 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.704004 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.704043 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.704143 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.704271 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2s5h\" (UniqueName: \"kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.704335 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.806570 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.806652 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.806833 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.806949 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2s5h\" (UniqueName: \"kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.807043 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.807088 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.807859 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.808336 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.808825 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.809530 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: 
\"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.809571 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.851430 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2s5h\" (UniqueName: \"kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h\") pod \"dnsmasq-dns-b654b794c-6x92r\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:55 crc kubenswrapper[4822]: I1010 08:11:55.893934 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:56 crc kubenswrapper[4822]: I1010 08:11:56.570720 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:11:57 crc kubenswrapper[4822]: I1010 08:11:57.400362 4822 generic.go:334] "Generic (PLEG): container finished" podID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerID="53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c" exitCode=0 Oct 10 08:11:57 crc kubenswrapper[4822]: I1010 08:11:57.400452 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b654b794c-6x92r" event={"ID":"0e090f19-c034-433b-a291-f5c3bb3ffb09","Type":"ContainerDied","Data":"53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c"} Oct 10 08:11:57 crc kubenswrapper[4822]: I1010 08:11:57.401297 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b654b794c-6x92r" 
event={"ID":"0e090f19-c034-433b-a291-f5c3bb3ffb09","Type":"ContainerStarted","Data":"4ac3780ac2cf2d7055cc54dc253dc6d888bb8e35817c03e8afd18b3d15cd0385"} Oct 10 08:11:58 crc kubenswrapper[4822]: I1010 08:11:58.415153 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b654b794c-6x92r" event={"ID":"0e090f19-c034-433b-a291-f5c3bb3ffb09","Type":"ContainerStarted","Data":"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a"} Oct 10 08:11:58 crc kubenswrapper[4822]: I1010 08:11:58.417480 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:11:58 crc kubenswrapper[4822]: I1010 08:11:58.474861 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b654b794c-6x92r" podStartSLOduration=3.474836906 podStartE2EDuration="3.474836906s" podCreationTimestamp="2025-10-10 08:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:11:58.456389804 +0000 UTC m=+6465.551548010" watchObservedRunningTime="2025-10-10 08:11:58.474836906 +0000 UTC m=+6465.569995122" Oct 10 08:12:05 crc kubenswrapper[4822]: I1010 08:12:05.896077 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:12:05 crc kubenswrapper[4822]: I1010 08:12:05.962684 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:12:05 crc kubenswrapper[4822]: I1010 08:12:05.963276 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="dnsmasq-dns" containerID="cri-o://37ab66428e8a85fe2aaad30a96eab82d62dda22e91481fec03503ff653b431b4" gracePeriod=10 Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.192047 4822 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c69776bc-xvf4j"] Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.195071 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.219633 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c69776bc-xvf4j"] Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.298753 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-nb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.298912 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-sb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.298954 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8hn\" (UniqueName: \"kubernetes.io/projected/a8abe278-5814-4951-afa4-3e2c66259376-kube-api-access-2z8hn\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.299035 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-dns-svc\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: 
\"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.299116 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-config\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.299162 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-openstack-cell1\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.402535 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-dns-svc\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.403231 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-config\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.403969 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-dns-svc\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " 
pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.404982 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-config\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.405447 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-openstack-cell1\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.406238 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-openstack-cell1\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.406729 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-nb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.407572 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-sb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc 
kubenswrapper[4822]: I1010 08:12:06.407510 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-nb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.408448 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8abe278-5814-4951-afa4-3e2c66259376-ovsdbserver-sb\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.408747 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8hn\" (UniqueName: \"kubernetes.io/projected/a8abe278-5814-4951-afa4-3e2c66259376-kube-api-access-2z8hn\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.431139 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8hn\" (UniqueName: \"kubernetes.io/projected/a8abe278-5814-4951-afa4-3e2c66259376-kube-api-access-2z8hn\") pod \"dnsmasq-dns-56c69776bc-xvf4j\" (UID: \"a8abe278-5814-4951-afa4-3e2c66259376\") " pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.512487 4822 generic.go:334] "Generic (PLEG): container finished" podID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerID="37ab66428e8a85fe2aaad30a96eab82d62dda22e91481fec03503ff653b431b4" exitCode=0 Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.512564 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" 
event={"ID":"4cf57931-2464-4634-bc1b-94f5bc60fe5a","Type":"ContainerDied","Data":"37ab66428e8a85fe2aaad30a96eab82d62dda22e91481fec03503ff653b431b4"} Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.540706 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.651301 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:12:06 crc kubenswrapper[4822]: E1010 08:12:06.651712 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.728268 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.919522 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxm82\" (UniqueName: \"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82\") pod \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.919788 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config\") pod \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.919829 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc\") pod \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.920023 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb\") pod \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.920219 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb\") pod \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\" (UID: \"4cf57931-2464-4634-bc1b-94f5bc60fe5a\") " Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.926622 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82" (OuterVolumeSpecName: "kube-api-access-dxm82") pod "4cf57931-2464-4634-bc1b-94f5bc60fe5a" (UID: "4cf57931-2464-4634-bc1b-94f5bc60fe5a"). InnerVolumeSpecName "kube-api-access-dxm82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.993190 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cf57931-2464-4634-bc1b-94f5bc60fe5a" (UID: "4cf57931-2464-4634-bc1b-94f5bc60fe5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:06 crc kubenswrapper[4822]: I1010 08:12:06.997204 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cf57931-2464-4634-bc1b-94f5bc60fe5a" (UID: "4cf57931-2464-4634-bc1b-94f5bc60fe5a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.009791 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config" (OuterVolumeSpecName: "config") pod "4cf57931-2464-4634-bc1b-94f5bc60fe5a" (UID: "4cf57931-2464-4634-bc1b-94f5bc60fe5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.023295 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cf57931-2464-4634-bc1b-94f5bc60fe5a" (UID: "4cf57931-2464-4634-bc1b-94f5bc60fe5a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.024600 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.024627 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxm82\" (UniqueName: \"kubernetes.io/projected/4cf57931-2464-4634-bc1b-94f5bc60fe5a-kube-api-access-dxm82\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.024640 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.024652 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.024660 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cf57931-2464-4634-bc1b-94f5bc60fe5a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.121347 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c69776bc-xvf4j"] Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.525094 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" event={"ID":"4cf57931-2464-4634-bc1b-94f5bc60fe5a","Type":"ContainerDied","Data":"3e7ccea69303c66d8dc6104a4153e5a1fea639d560be529cecde447868005d87"} Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.525575 4822 scope.go:117] "RemoveContainer" 
containerID="37ab66428e8a85fe2aaad30a96eab82d62dda22e91481fec03503ff653b431b4" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.525728 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.536518 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" event={"ID":"a8abe278-5814-4951-afa4-3e2c66259376","Type":"ContainerStarted","Data":"8d572706e7133c904ee81bc2d2ed1f23d5047018cf4d9e04544cb3d23e415889"} Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.536566 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" event={"ID":"a8abe278-5814-4951-afa4-3e2c66259376","Type":"ContainerStarted","Data":"b8425c18cd5f65031a18c8a554f900e0e394555bca891040fdd4656c948dba8f"} Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.567659 4822 scope.go:117] "RemoveContainer" containerID="040423b4f92cba0fd7984724feb76a11a99eefdcf79570dcd95d4868fd190c0d" Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.571067 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.580188 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56d9666d69-2mg5m"] Oct 10 08:12:07 crc kubenswrapper[4822]: I1010 08:12:07.685303 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" path="/var/lib/kubelet/pods/4cf57931-2464-4634-bc1b-94f5bc60fe5a/volumes" Oct 10 08:12:08 crc kubenswrapper[4822]: I1010 08:12:08.550520 4822 generic.go:334] "Generic (PLEG): container finished" podID="a8abe278-5814-4951-afa4-3e2c66259376" containerID="8d572706e7133c904ee81bc2d2ed1f23d5047018cf4d9e04544cb3d23e415889" exitCode=0 Oct 10 08:12:08 crc kubenswrapper[4822]: I1010 08:12:08.550627 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" event={"ID":"a8abe278-5814-4951-afa4-3e2c66259376","Type":"ContainerDied","Data":"8d572706e7133c904ee81bc2d2ed1f23d5047018cf4d9e04544cb3d23e415889"} Oct 10 08:12:09 crc kubenswrapper[4822]: I1010 08:12:09.565107 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" event={"ID":"a8abe278-5814-4951-afa4-3e2c66259376","Type":"ContainerStarted","Data":"402a427555920adc09b9191e57708ff2d5bb419ebd85b1137244e6e8b3996a8d"} Oct 10 08:12:09 crc kubenswrapper[4822]: I1010 08:12:09.565795 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:09 crc kubenswrapper[4822]: I1010 08:12:09.588820 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" podStartSLOduration=3.588784154 podStartE2EDuration="3.588784154s" podCreationTimestamp="2025-10-10 08:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:12:09.587443585 +0000 UTC m=+6476.682601791" watchObservedRunningTime="2025-10-10 08:12:09.588784154 +0000 UTC m=+6476.683942340" Oct 10 08:12:11 crc kubenswrapper[4822]: I1010 08:12:11.311686 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56d9666d69-2mg5m" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.149:5353: i/o timeout" Oct 10 08:12:16 crc kubenswrapper[4822]: I1010 08:12:16.543152 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c69776bc-xvf4j" Oct 10 08:12:16 crc kubenswrapper[4822]: I1010 08:12:16.667473 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:12:16 crc kubenswrapper[4822]: I1010 
08:12:16.667775 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b654b794c-6x92r" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="dnsmasq-dns" containerID="cri-o://df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a" gracePeriod=10 Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.312435 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344270 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344361 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344489 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344658 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2s5h\" (UniqueName: \"kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344724 4822 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.344789 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config\") pod \"0e090f19-c034-433b-a291-f5c3bb3ffb09\" (UID: \"0e090f19-c034-433b-a291-f5c3bb3ffb09\") " Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.361105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h" (OuterVolumeSpecName: "kube-api-access-f2s5h") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "kube-api-access-f2s5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.460466 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2s5h\" (UniqueName: \"kubernetes.io/projected/0e090f19-c034-433b-a291-f5c3bb3ffb09-kube-api-access-f2s5h\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.484832 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.498552 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.518083 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.525362 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.531166 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config" (OuterVolumeSpecName: "config") pod "0e090f19-c034-433b-a291-f5c3bb3ffb09" (UID: "0e090f19-c034-433b-a291-f5c3bb3ffb09"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.563108 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.563147 4822 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.563157 4822 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.563165 4822 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.563173 4822 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e090f19-c034-433b-a291-f5c3bb3ffb09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.685490 4822 generic.go:334] "Generic (PLEG): container finished" podID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerID="df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a" exitCode=0 Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.685579 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b654b794c-6x92r" event={"ID":"0e090f19-c034-433b-a291-f5c3bb3ffb09","Type":"ContainerDied","Data":"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a"} Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.685602 
4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b654b794c-6x92r" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.685627 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b654b794c-6x92r" event={"ID":"0e090f19-c034-433b-a291-f5c3bb3ffb09","Type":"ContainerDied","Data":"4ac3780ac2cf2d7055cc54dc253dc6d888bb8e35817c03e8afd18b3d15cd0385"} Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.685649 4822 scope.go:117] "RemoveContainer" containerID="df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.710010 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.712937 4822 scope.go:117] "RemoveContainer" containerID="53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.721787 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b654b794c-6x92r"] Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.736684 4822 scope.go:117] "RemoveContainer" containerID="df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a" Oct 10 08:12:17 crc kubenswrapper[4822]: E1010 08:12:17.737152 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a\": container with ID starting with df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a not found: ID does not exist" containerID="df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.737202 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a"} 
err="failed to get container status \"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a\": rpc error: code = NotFound desc = could not find container \"df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a\": container with ID starting with df4e1536281b56e1f715d4d373061cab049a01a89aec1adb4ce977dca86cfd7a not found: ID does not exist" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.737229 4822 scope.go:117] "RemoveContainer" containerID="53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c" Oct 10 08:12:17 crc kubenswrapper[4822]: E1010 08:12:17.737519 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c\": container with ID starting with 53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c not found: ID does not exist" containerID="53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c" Oct 10 08:12:17 crc kubenswrapper[4822]: I1010 08:12:17.737538 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c"} err="failed to get container status \"53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c\": rpc error: code = NotFound desc = could not find container \"53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c\": container with ID starting with 53a56c785daa95ee30088f8fe05117c151cda0789d0ec54c00684fe13e8cb27c not found: ID does not exist" Oct 10 08:12:18 crc kubenswrapper[4822]: I1010 08:12:18.651273 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:12:18 crc kubenswrapper[4822]: E1010 08:12:18.652005 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:12:19 crc kubenswrapper[4822]: I1010 08:12:19.665637 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" path="/var/lib/kubelet/pods/0e090f19-c034-433b-a291-f5c3bb3ffb09/volumes" Oct 10 08:12:21 crc kubenswrapper[4822]: I1010 08:12:21.063651 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-xzrsl"] Oct 10 08:12:21 crc kubenswrapper[4822]: I1010 08:12:21.080084 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-xzrsl"] Oct 10 08:12:21 crc kubenswrapper[4822]: I1010 08:12:21.666755 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb74e893-7ef5-491e-8375-722cc4449667" path="/var/lib/kubelet/pods/fb74e893-7ef5-491e-8375-722cc4449667/volumes" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.299523 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f"] Oct 10 08:12:27 crc kubenswrapper[4822]: E1010 08:12:27.300623 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="init" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.300640 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="init" Oct 10 08:12:27 crc kubenswrapper[4822]: E1010 08:12:27.300672 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="init" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.300681 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="init" Oct 10 08:12:27 crc kubenswrapper[4822]: E1010 08:12:27.300701 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.300730 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: E1010 08:12:27.300752 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.300759 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.301044 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf57931-2464-4634-bc1b-94f5bc60fe5a" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.301057 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e090f19-c034-433b-a291-f5c3bb3ffb09" containerName="dnsmasq-dns" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.302001 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.304552 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.305083 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.306170 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.307077 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.374470 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f"] Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.420211 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.420286 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 
08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.420353 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zjsc\" (UniqueName: \"kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.420407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.420760 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.522468 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.522565 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.522600 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zjsc\" (UniqueName: \"kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.522630 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.522717 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.528410 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.528480 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.529117 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.542939 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.553306 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zjsc\" (UniqueName: \"kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c5286f\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:27 crc kubenswrapper[4822]: I1010 08:12:27.634975 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:28 crc kubenswrapper[4822]: I1010 08:12:28.399182 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f"] Oct 10 08:12:28 crc kubenswrapper[4822]: W1010 08:12:28.402014 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod352cc9ff_7a6d_4e9e_b662_5051df6b3be7.slice/crio-980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052 WatchSource:0}: Error finding container 980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052: Status 404 returned error can't find the container with id 980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052 Oct 10 08:12:28 crc kubenswrapper[4822]: I1010 08:12:28.823315 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" event={"ID":"352cc9ff-7a6d-4e9e-b662-5051df6b3be7","Type":"ContainerStarted","Data":"980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052"} Oct 10 08:12:31 crc kubenswrapper[4822]: I1010 08:12:31.728972 4822 scope.go:117] "RemoveContainer" containerID="aaae650c9d44f7b16749f5cab3702d985b0db1b9bda4f6850439de4631735b08" Oct 10 08:12:33 crc kubenswrapper[4822]: I1010 08:12:33.661148 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:12:33 crc kubenswrapper[4822]: E1010 08:12:33.666041 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:12:34 crc kubenswrapper[4822]: I1010 08:12:34.040404 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-8707-account-create-vrjp5"] Oct 10 08:12:34 crc kubenswrapper[4822]: I1010 08:12:34.052730 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-8707-account-create-vrjp5"] Oct 10 08:12:35 crc kubenswrapper[4822]: I1010 08:12:35.698492 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd92ddf-425c-4223-b122-93691db8c391" path="/var/lib/kubelet/pods/2cd92ddf-425c-4223-b122-93691db8c391/volumes" Oct 10 08:12:36 crc kubenswrapper[4822]: I1010 08:12:36.628334 4822 scope.go:117] "RemoveContainer" containerID="bd69a984a9ed5778c5cf62ad5f6b0532eb1330edeb59889f3fbfefcb7a785a76" Oct 10 08:12:36 crc kubenswrapper[4822]: I1010 08:12:36.856797 4822 scope.go:117] "RemoveContainer" containerID="ca00d5e808188ce282ab8ed1846df94d429ea639ec576373609098d8b0c7bd7d" Oct 10 08:12:37 crc kubenswrapper[4822]: I1010 08:12:37.921670 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" event={"ID":"352cc9ff-7a6d-4e9e-b662-5051df6b3be7","Type":"ContainerStarted","Data":"dacead394406030e7bbe83e7c6de144c29a836ccb094bf821f6042bf74cb794c"} Oct 10 08:12:37 crc kubenswrapper[4822]: I1010 08:12:37.945038 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" podStartSLOduration=2.377112052 podStartE2EDuration="10.945015872s" podCreationTimestamp="2025-10-10 08:12:27 +0000 UTC" firstStartedPulling="2025-10-10 08:12:28.404301557 +0000 UTC m=+6495.499459753" lastFinishedPulling="2025-10-10 
08:12:36.972205377 +0000 UTC m=+6504.067363573" observedRunningTime="2025-10-10 08:12:37.940258814 +0000 UTC m=+6505.035417080" watchObservedRunningTime="2025-10-10 08:12:37.945015872 +0000 UTC m=+6505.040174108" Oct 10 08:12:40 crc kubenswrapper[4822]: I1010 08:12:40.028668 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-lcs5n"] Oct 10 08:12:40 crc kubenswrapper[4822]: I1010 08:12:40.038897 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-lcs5n"] Oct 10 08:12:41 crc kubenswrapper[4822]: I1010 08:12:41.662988 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f" path="/var/lib/kubelet/pods/b8cf8062-c5d4-4bf1-8ab7-3c30f6ab0d8f/volumes" Oct 10 08:12:44 crc kubenswrapper[4822]: I1010 08:12:44.650438 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:12:44 crc kubenswrapper[4822]: E1010 08:12:44.651169 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:12:52 crc kubenswrapper[4822]: I1010 08:12:52.031591 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-1b34-account-create-5ls9k"] Oct 10 08:12:52 crc kubenswrapper[4822]: I1010 08:12:52.045212 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-1b34-account-create-5ls9k"] Oct 10 08:12:52 crc kubenswrapper[4822]: I1010 08:12:52.107565 4822 generic.go:334] "Generic (PLEG): container finished" podID="352cc9ff-7a6d-4e9e-b662-5051df6b3be7" 
containerID="dacead394406030e7bbe83e7c6de144c29a836ccb094bf821f6042bf74cb794c" exitCode=0 Oct 10 08:12:52 crc kubenswrapper[4822]: I1010 08:12:52.107610 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" event={"ID":"352cc9ff-7a6d-4e9e-b662-5051df6b3be7","Type":"ContainerDied","Data":"dacead394406030e7bbe83e7c6de144c29a836ccb094bf821f6042bf74cb794c"} Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.641568 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.691450 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044fd4b7-7693-4a14-ab68-9003c5ecc759" path="/var/lib/kubelet/pods/044fd4b7-7693-4a14-ab68-9003c5ecc759/volumes" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.805378 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key\") pod \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.805484 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph\") pod \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.805543 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle\") pod \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " Oct 10 
08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.805570 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory\") pod \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.805788 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zjsc\" (UniqueName: \"kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc\") pod \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\" (UID: \"352cc9ff-7a6d-4e9e-b662-5051df6b3be7\") " Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.811215 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph" (OuterVolumeSpecName: "ceph") pod "352cc9ff-7a6d-4e9e-b662-5051df6b3be7" (UID: "352cc9ff-7a6d-4e9e-b662-5051df6b3be7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.812318 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "352cc9ff-7a6d-4e9e-b662-5051df6b3be7" (UID: "352cc9ff-7a6d-4e9e-b662-5051df6b3be7"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.812703 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc" (OuterVolumeSpecName: "kube-api-access-4zjsc") pod "352cc9ff-7a6d-4e9e-b662-5051df6b3be7" (UID: "352cc9ff-7a6d-4e9e-b662-5051df6b3be7"). 
InnerVolumeSpecName "kube-api-access-4zjsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.834910 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "352cc9ff-7a6d-4e9e-b662-5051df6b3be7" (UID: "352cc9ff-7a6d-4e9e-b662-5051df6b3be7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.846687 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory" (OuterVolumeSpecName: "inventory") pod "352cc9ff-7a6d-4e9e-b662-5051df6b3be7" (UID: "352cc9ff-7a6d-4e9e-b662-5051df6b3be7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.908892 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.908936 4822 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.908955 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.908968 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zjsc\" (UniqueName: 
\"kubernetes.io/projected/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-kube-api-access-4zjsc\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:53 crc kubenswrapper[4822]: I1010 08:12:53.908981 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/352cc9ff-7a6d-4e9e-b662-5051df6b3be7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:54 crc kubenswrapper[4822]: I1010 08:12:54.130886 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" event={"ID":"352cc9ff-7a6d-4e9e-b662-5051df6b3be7","Type":"ContainerDied","Data":"980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052"} Oct 10 08:12:54 crc kubenswrapper[4822]: I1010 08:12:54.130943 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c5286f" Oct 10 08:12:54 crc kubenswrapper[4822]: I1010 08:12:54.130962 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980bade43dfaa75be1e22dd497b7a5dcc608392e0c18ebf9cf8e2be3bcaf5052" Oct 10 08:12:58 crc kubenswrapper[4822]: I1010 08:12:58.651165 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:12:58 crc kubenswrapper[4822]: E1010 08:12:58.652013 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.012256 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl"] Oct 10 08:13:01 crc kubenswrapper[4822]: E1010 08:13:01.014214 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352cc9ff-7a6d-4e9e-b662-5051df6b3be7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.014324 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="352cc9ff-7a6d-4e9e-b662-5051df6b3be7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.014743 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="352cc9ff-7a6d-4e9e-b662-5051df6b3be7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.017394 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.019635 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.020250 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.020458 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.023099 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.026619 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl"] Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.086464 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdzvr\" (UniqueName: \"kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.087152 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.087426 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.087487 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.087827 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.189869 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.190257 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.190279 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.190350 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 
10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.190390 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdzvr\" (UniqueName: \"kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.198330 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.199586 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.202948 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.204587 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.208865 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdzvr\" (UniqueName: \"kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.350584 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:13:01 crc kubenswrapper[4822]: I1010 08:13:01.894525 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl"] Oct 10 08:13:02 crc kubenswrapper[4822]: I1010 08:13:02.217211 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" event={"ID":"cb5a26a4-8ae3-4b96-a905-70be164e9198","Type":"ContainerStarted","Data":"e7b8d53e203b0fddbbfd7ae8dd01fc60c94c040f1a1ce9772faabe58e5a5360f"} Oct 10 08:13:03 crc kubenswrapper[4822]: I1010 08:13:03.232945 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" event={"ID":"cb5a26a4-8ae3-4b96-a905-70be164e9198","Type":"ContainerStarted","Data":"93995f98556f396cdf49bdf8ef00217a74ebcc70e5a36c7ebc867cc73d6635bb"} Oct 10 08:13:03 crc kubenswrapper[4822]: I1010 08:13:03.255856 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" podStartSLOduration=2.754566153 podStartE2EDuration="3.255838594s" 
podCreationTimestamp="2025-10-10 08:13:00 +0000 UTC" firstStartedPulling="2025-10-10 08:13:01.901142021 +0000 UTC m=+6528.996300227" lastFinishedPulling="2025-10-10 08:13:02.402414452 +0000 UTC m=+6529.497572668" observedRunningTime="2025-10-10 08:13:03.254009111 +0000 UTC m=+6530.349167307" watchObservedRunningTime="2025-10-10 08:13:03.255838594 +0000 UTC m=+6530.350996790" Oct 10 08:13:10 crc kubenswrapper[4822]: I1010 08:13:10.650871 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:13:10 crc kubenswrapper[4822]: E1010 08:13:10.652213 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:13:25 crc kubenswrapper[4822]: I1010 08:13:25.651272 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:13:25 crc kubenswrapper[4822]: E1010 08:13:25.652274 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:13:37 crc kubenswrapper[4822]: I1010 08:13:37.079379 4822 scope.go:117] "RemoveContainer" containerID="d588d39a8b8b662002493c412e920b10057591498234ff49d0cd35c93e689026" Oct 10 08:13:37 crc kubenswrapper[4822]: I1010 08:13:37.113535 4822 scope.go:117] 
"RemoveContainer" containerID="482bcca52713a1bb97d49d481a621df5d0978353e2bcd4a1cb6e8f1b34120d48" Oct 10 08:13:37 crc kubenswrapper[4822]: I1010 08:13:37.174741 4822 scope.go:117] "RemoveContainer" containerID="04cdb3a6edb7543c92fab6c5e7549b2ca5d6862bb901ea4fd2358f087fb6856b" Oct 10 08:13:38 crc kubenswrapper[4822]: I1010 08:13:38.066660 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-k7mlg"] Oct 10 08:13:38 crc kubenswrapper[4822]: I1010 08:13:38.077511 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-k7mlg"] Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.542258 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.545791 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.554937 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.603656 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.603998 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbglg\" (UniqueName: \"kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 
08:13:39.604078 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.663128 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7877f7a6-99da-42ea-9e23-57b3ac4612de" path="/var/lib/kubelet/pods/7877f7a6-99da-42ea-9e23-57b3ac4612de/volumes" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.706642 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.706776 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.707017 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbglg\" (UniqueName: \"kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.707314 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.708622 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.730372 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbglg\" (UniqueName: \"kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg\") pod \"redhat-marketplace-79xtc\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:39 crc kubenswrapper[4822]: I1010 08:13:39.887026 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:40 crc kubenswrapper[4822]: I1010 08:13:40.410175 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:40 crc kubenswrapper[4822]: I1010 08:13:40.650451 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:13:40 crc kubenswrapper[4822]: E1010 08:13:40.652263 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:13:40 crc kubenswrapper[4822]: I1010 08:13:40.703342 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerStarted","Data":"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668"} Oct 10 08:13:40 crc kubenswrapper[4822]: I1010 08:13:40.707738 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerStarted","Data":"5814e41ba014576d5813b60967488459e2aa23f1bd6795524cd9189ca571e2f7"} Oct 10 08:13:41 crc kubenswrapper[4822]: I1010 08:13:41.713667 4822 generic.go:334] "Generic (PLEG): container finished" podID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerID="6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668" exitCode=0 Oct 10 08:13:41 crc kubenswrapper[4822]: I1010 08:13:41.713783 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" 
event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerDied","Data":"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668"} Oct 10 08:13:41 crc kubenswrapper[4822]: I1010 08:13:41.715560 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:13:43 crc kubenswrapper[4822]: I1010 08:13:43.735706 4822 generic.go:334] "Generic (PLEG): container finished" podID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerID="158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf" exitCode=0 Oct 10 08:13:43 crc kubenswrapper[4822]: I1010 08:13:43.736041 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerDied","Data":"158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf"} Oct 10 08:13:44 crc kubenswrapper[4822]: I1010 08:13:44.751879 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerStarted","Data":"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c"} Oct 10 08:13:44 crc kubenswrapper[4822]: I1010 08:13:44.791117 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79xtc" podStartSLOduration=3.309660828 podStartE2EDuration="5.791098714s" podCreationTimestamp="2025-10-10 08:13:39 +0000 UTC" firstStartedPulling="2025-10-10 08:13:41.715272622 +0000 UTC m=+6568.810430838" lastFinishedPulling="2025-10-10 08:13:44.196710528 +0000 UTC m=+6571.291868724" observedRunningTime="2025-10-10 08:13:44.780528779 +0000 UTC m=+6571.875686985" watchObservedRunningTime="2025-10-10 08:13:44.791098714 +0000 UTC m=+6571.886256920" Oct 10 08:13:49 crc kubenswrapper[4822]: I1010 08:13:49.887824 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:49 crc kubenswrapper[4822]: I1010 08:13:49.888308 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:49 crc kubenswrapper[4822]: I1010 08:13:49.976883 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:50 crc kubenswrapper[4822]: I1010 08:13:50.905153 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:50 crc kubenswrapper[4822]: I1010 08:13:50.958568 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:52 crc kubenswrapper[4822]: I1010 08:13:52.851182 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79xtc" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="registry-server" containerID="cri-o://d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c" gracePeriod=2 Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.371885 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.462001 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities\") pod \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.462196 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbglg\" (UniqueName: \"kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg\") pod \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.462364 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content\") pod \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\" (UID: \"89ce0c93-f7fb-4f4f-97b8-deda7480e216\") " Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.462740 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities" (OuterVolumeSpecName: "utilities") pod "89ce0c93-f7fb-4f4f-97b8-deda7480e216" (UID: "89ce0c93-f7fb-4f4f-97b8-deda7480e216"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.463721 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.469975 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg" (OuterVolumeSpecName: "kube-api-access-tbglg") pod "89ce0c93-f7fb-4f4f-97b8-deda7480e216" (UID: "89ce0c93-f7fb-4f4f-97b8-deda7480e216"). InnerVolumeSpecName "kube-api-access-tbglg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.484294 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89ce0c93-f7fb-4f4f-97b8-deda7480e216" (UID: "89ce0c93-f7fb-4f4f-97b8-deda7480e216"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.566989 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89ce0c93-f7fb-4f4f-97b8-deda7480e216-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.567056 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbglg\" (UniqueName: \"kubernetes.io/projected/89ce0c93-f7fb-4f4f-97b8-deda7480e216-kube-api-access-tbglg\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.864894 4822 generic.go:334] "Generic (PLEG): container finished" podID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerID="d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c" exitCode=0 Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.864958 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerDied","Data":"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c"} Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.865014 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79xtc" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.865042 4822 scope.go:117] "RemoveContainer" containerID="d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.865024 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79xtc" event={"ID":"89ce0c93-f7fb-4f4f-97b8-deda7480e216","Type":"ContainerDied","Data":"5814e41ba014576d5813b60967488459e2aa23f1bd6795524cd9189ca571e2f7"} Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.909575 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.912224 4822 scope.go:117] "RemoveContainer" containerID="158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf" Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.922824 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79xtc"] Oct 10 08:13:53 crc kubenswrapper[4822]: I1010 08:13:53.946892 4822 scope.go:117] "RemoveContainer" containerID="6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.002853 4822 scope.go:117] "RemoveContainer" containerID="d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c" Oct 10 08:13:54 crc kubenswrapper[4822]: E1010 08:13:54.003279 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c\": container with ID starting with d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c not found: ID does not exist" containerID="d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.003343 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c"} err="failed to get container status \"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c\": rpc error: code = NotFound desc = could not find container \"d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c\": container with ID starting with d736db7c35f5c956b005a2d79cb3d45abf5149800abe0b3723f7c0270cce7e7c not found: ID does not exist" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.003370 4822 scope.go:117] "RemoveContainer" containerID="158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf" Oct 10 08:13:54 crc kubenswrapper[4822]: E1010 08:13:54.003839 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf\": container with ID starting with 158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf not found: ID does not exist" containerID="158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.003948 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf"} err="failed to get container status \"158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf\": rpc error: code = NotFound desc = could not find container \"158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf\": container with ID starting with 158f764c973c1e7bddecb5bf34aab9b5939a065dcf379716e621ff1ef7eebcbf not found: ID does not exist" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.004001 4822 scope.go:117] "RemoveContainer" containerID="6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668" Oct 10 08:13:54 crc kubenswrapper[4822]: E1010 
08:13:54.004371 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668\": container with ID starting with 6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668 not found: ID does not exist" containerID="6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.004416 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668"} err="failed to get container status \"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668\": rpc error: code = NotFound desc = could not find container \"6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668\": container with ID starting with 6abd7ceca779704548a7264fa5c9151fe62202bc163dec2fdc3e167b61092668 not found: ID does not exist" Oct 10 08:13:54 crc kubenswrapper[4822]: I1010 08:13:54.650959 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:13:54 crc kubenswrapper[4822]: E1010 08:13:54.651362 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:13:55 crc kubenswrapper[4822]: I1010 08:13:55.673679 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" path="/var/lib/kubelet/pods/89ce0c93-f7fb-4f4f-97b8-deda7480e216/volumes" Oct 10 08:14:08 crc kubenswrapper[4822]: I1010 08:14:08.651337 
4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:14:08 crc kubenswrapper[4822]: E1010 08:14:08.652113 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:14:20 crc kubenswrapper[4822]: I1010 08:14:20.652378 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:14:20 crc kubenswrapper[4822]: E1010 08:14:20.653892 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:14:33 crc kubenswrapper[4822]: I1010 08:14:33.665265 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:14:33 crc kubenswrapper[4822]: E1010 08:14:33.666725 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:14:37 crc kubenswrapper[4822]: I1010 
08:14:37.305868 4822 scope.go:117] "RemoveContainer" containerID="775f60e87eee22eb77e41cb2b5e36d4b6186213609467c37676e194dc6574eb5" Oct 10 08:14:37 crc kubenswrapper[4822]: I1010 08:14:37.350677 4822 scope.go:117] "RemoveContainer" containerID="58da1e8080e70ed5b34ed9647cfebc630f63aa6446632666982b3185eed1bc11" Oct 10 08:14:48 crc kubenswrapper[4822]: I1010 08:14:48.650471 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:14:48 crc kubenswrapper[4822]: E1010 08:14:48.651383 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.183601 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h"] Oct 10 08:15:00 crc kubenswrapper[4822]: E1010 08:15:00.184608 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="extract-content" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.184622 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="extract-content" Oct 10 08:15:00 crc kubenswrapper[4822]: E1010 08:15:00.184670 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="registry-server" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.184676 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="registry-server" Oct 10 08:15:00 crc 
kubenswrapper[4822]: E1010 08:15:00.184694 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="extract-utilities" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.184703 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="extract-utilities" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.184926 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ce0c93-f7fb-4f4f-97b8-deda7480e216" containerName="registry-server" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.185713 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.188472 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.188694 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.199432 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h"] Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.263880 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.264523 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.264675 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2v7\" (UniqueName: \"kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.366498 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.366582 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.366629 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2v7\" (UniqueName: \"kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.367480 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.372344 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.382446 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2v7\" (UniqueName: \"kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7\") pod \"collect-profiles-29334735-6hh9h\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.508750 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:00 crc kubenswrapper[4822]: I1010 08:15:00.651670 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:15:00 crc kubenswrapper[4822]: E1010 08:15:00.652395 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:15:01 crc kubenswrapper[4822]: I1010 08:15:01.002159 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h"] Oct 10 08:15:01 crc kubenswrapper[4822]: I1010 08:15:01.673035 4822 generic.go:334] "Generic (PLEG): container finished" podID="d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" containerID="85d5514f4ece88ea1fe1334b2838c59e14c4c26d3accbde0d6c07158d0b271e0" exitCode=0 Oct 10 08:15:01 crc kubenswrapper[4822]: I1010 08:15:01.673146 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" event={"ID":"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97","Type":"ContainerDied","Data":"85d5514f4ece88ea1fe1334b2838c59e14c4c26d3accbde0d6c07158d0b271e0"} Oct 10 08:15:01 crc kubenswrapper[4822]: I1010 08:15:01.673365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" event={"ID":"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97","Type":"ContainerStarted","Data":"d1ad7c102d86f301e313e344c44de73152c0aab5942919f52ffd18f222033df2"} Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.046198 4822 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.135787 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume\") pod \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.136301 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume\") pod \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.136338 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v2v7\" (UniqueName: \"kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7\") pod \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\" (UID: \"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97\") " Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.136869 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume" (OuterVolumeSpecName: "config-volume") pod "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" (UID: "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.141867 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" (UID: "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.147470 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7" (OuterVolumeSpecName: "kube-api-access-9v2v7") pod "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" (UID: "d817ef48-a80c-4a94-9ca9-55cf1cfd2f97"). InnerVolumeSpecName "kube-api-access-9v2v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.238450 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.238487 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.238503 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v2v7\" (UniqueName: \"kubernetes.io/projected/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97-kube-api-access-9v2v7\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.693432 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" event={"ID":"d817ef48-a80c-4a94-9ca9-55cf1cfd2f97","Type":"ContainerDied","Data":"d1ad7c102d86f301e313e344c44de73152c0aab5942919f52ffd18f222033df2"} Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.693474 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ad7c102d86f301e313e344c44de73152c0aab5942919f52ffd18f222033df2" Oct 10 08:15:03 crc kubenswrapper[4822]: I1010 08:15:03.693527 4822 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h" Oct 10 08:15:04 crc kubenswrapper[4822]: I1010 08:15:04.118965 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls"] Oct 10 08:15:04 crc kubenswrapper[4822]: I1010 08:15:04.127449 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-vfjls"] Oct 10 08:15:05 crc kubenswrapper[4822]: I1010 08:15:05.664973 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd" path="/var/lib/kubelet/pods/c2edaa2e-e00e-4ecd-95ae-6b7d5fda9bfd/volumes" Oct 10 08:15:12 crc kubenswrapper[4822]: I1010 08:15:12.650781 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:15:12 crc kubenswrapper[4822]: E1010 08:15:12.652059 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:15:24 crc kubenswrapper[4822]: I1010 08:15:24.650676 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:15:24 crc kubenswrapper[4822]: E1010 08:15:24.651628 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:15:36 crc kubenswrapper[4822]: I1010 08:15:36.651217 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:15:36 crc kubenswrapper[4822]: E1010 08:15:36.653609 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:15:37 crc kubenswrapper[4822]: I1010 08:15:37.462303 4822 scope.go:117] "RemoveContainer" containerID="ae0a478de330f43220f4507c8bc6d37313574d0121d7c7723cb350513f215dba" Oct 10 08:15:48 crc kubenswrapper[4822]: I1010 08:15:48.651604 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:15:48 crc kubenswrapper[4822]: E1010 08:15:48.653775 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:16:02 crc kubenswrapper[4822]: I1010 08:16:02.650314 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:16:03 crc kubenswrapper[4822]: I1010 08:16:03.392658 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c"} Oct 10 08:17:10 crc kubenswrapper[4822]: I1010 08:17:10.043983 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-6m794"] Oct 10 08:17:10 crc kubenswrapper[4822]: I1010 08:17:10.053303 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-6m794"] Oct 10 08:17:11 crc kubenswrapper[4822]: I1010 08:17:11.684679 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e836530f-40ee-424b-9b14-3456252a1b43" path="/var/lib/kubelet/pods/e836530f-40ee-424b-9b14-3456252a1b43/volumes" Oct 10 08:17:20 crc kubenswrapper[4822]: I1010 08:17:20.036500 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4e57-account-create-pvtw5"] Oct 10 08:17:20 crc kubenswrapper[4822]: I1010 08:17:20.074631 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4e57-account-create-pvtw5"] Oct 10 08:17:21 crc kubenswrapper[4822]: I1010 08:17:21.669149 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5de391-05fb-4ce3-9c38-c48a25e28b88" path="/var/lib/kubelet/pods/0e5de391-05fb-4ce3-9c38-c48a25e28b88/volumes" Oct 10 08:17:33 crc kubenswrapper[4822]: I1010 08:17:33.034529 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-pp6p9"] Oct 10 08:17:33 crc kubenswrapper[4822]: I1010 08:17:33.047971 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-pp6p9"] Oct 10 08:17:33 crc kubenswrapper[4822]: I1010 08:17:33.662210 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a701b49-d6e2-47dc-a329-9b88b163c568" path="/var/lib/kubelet/pods/7a701b49-d6e2-47dc-a329-9b88b163c568/volumes" Oct 10 08:17:37 crc 
kubenswrapper[4822]: I1010 08:17:37.595851 4822 scope.go:117] "RemoveContainer" containerID="1806c2420a6ead245afe58116143a58c158d4dc95b705ae7593f931eff8489c8" Oct 10 08:17:37 crc kubenswrapper[4822]: I1010 08:17:37.629165 4822 scope.go:117] "RemoveContainer" containerID="895fd2984f54460d1a23d4e51b2ea31e64165ba74401499f2ca82fe5fc664a8b" Oct 10 08:17:37 crc kubenswrapper[4822]: I1010 08:17:37.674464 4822 scope.go:117] "RemoveContainer" containerID="bac652866ea95e6d1fb45710a3ab56125adec615af37ed526cf30506df8f6966" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.622702 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:17:56 crc kubenswrapper[4822]: E1010 08:17:56.623690 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" containerName="collect-profiles" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.623706 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" containerName="collect-profiles" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.624050 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" containerName="collect-profiles" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.626091 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.631935 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.632163 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.632240 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpt7v\" (UniqueName: \"kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.636845 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.735280 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.735394 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.735428 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpt7v\" (UniqueName: \"kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.735883 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.736059 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.756225 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpt7v\" (UniqueName: \"kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v\") pod \"redhat-operators-9s59c\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:56 crc kubenswrapper[4822]: I1010 08:17:56.950760 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:17:57 crc kubenswrapper[4822]: I1010 08:17:57.457297 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:17:57 crc kubenswrapper[4822]: I1010 08:17:57.705287 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerStarted","Data":"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596"} Oct 10 08:17:57 crc kubenswrapper[4822]: I1010 08:17:57.705620 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerStarted","Data":"173b2f2066311f0b7aad14c678eaaab7c0618e394b6abcda64ba92b94d66a4e3"} Oct 10 08:17:58 crc kubenswrapper[4822]: I1010 08:17:58.716953 4822 generic.go:334] "Generic (PLEG): container finished" podID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerID="d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596" exitCode=0 Oct 10 08:17:58 crc kubenswrapper[4822]: I1010 08:17:58.717293 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerDied","Data":"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596"} Oct 10 08:17:59 crc kubenswrapper[4822]: I1010 08:17:59.728504 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerStarted","Data":"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d"} Oct 10 08:18:02 crc kubenswrapper[4822]: I1010 08:18:02.772247 4822 generic.go:334] "Generic (PLEG): container finished" podID="4f8a8536-2bb1-4469-b129-7692f01fd739" 
containerID="5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d" exitCode=0 Oct 10 08:18:02 crc kubenswrapper[4822]: I1010 08:18:02.772469 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerDied","Data":"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d"} Oct 10 08:18:03 crc kubenswrapper[4822]: I1010 08:18:03.783764 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerStarted","Data":"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad"} Oct 10 08:18:03 crc kubenswrapper[4822]: I1010 08:18:03.801369 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9s59c" podStartSLOduration=3.23318112 podStartE2EDuration="7.801350614s" podCreationTimestamp="2025-10-10 08:17:56 +0000 UTC" firstStartedPulling="2025-10-10 08:17:58.720850311 +0000 UTC m=+6825.816008527" lastFinishedPulling="2025-10-10 08:18:03.289019825 +0000 UTC m=+6830.384178021" observedRunningTime="2025-10-10 08:18:03.800383896 +0000 UTC m=+6830.895542102" watchObservedRunningTime="2025-10-10 08:18:03.801350614 +0000 UTC m=+6830.896508820" Oct 10 08:18:06 crc kubenswrapper[4822]: I1010 08:18:06.952089 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:06 crc kubenswrapper[4822]: I1010 08:18:06.952686 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:08 crc kubenswrapper[4822]: I1010 08:18:08.013153 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9s59c" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" 
probeResult="failure" output=< Oct 10 08:18:08 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:18:08 crc kubenswrapper[4822]: > Oct 10 08:18:18 crc kubenswrapper[4822]: I1010 08:18:18.013084 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9s59c" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" probeResult="failure" output=< Oct 10 08:18:18 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:18:18 crc kubenswrapper[4822]: > Oct 10 08:18:27 crc kubenswrapper[4822]: I1010 08:18:27.008038 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:27 crc kubenswrapper[4822]: I1010 08:18:27.061508 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:27 crc kubenswrapper[4822]: I1010 08:18:27.820834 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.027655 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9s59c" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" containerID="cri-o://9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad" gracePeriod=2 Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.616841 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.699282 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpt7v\" (UniqueName: \"kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v\") pod \"4f8a8536-2bb1-4469-b129-7692f01fd739\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.699390 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities\") pod \"4f8a8536-2bb1-4469-b129-7692f01fd739\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.699494 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content\") pod \"4f8a8536-2bb1-4469-b129-7692f01fd739\" (UID: \"4f8a8536-2bb1-4469-b129-7692f01fd739\") " Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.700736 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities" (OuterVolumeSpecName: "utilities") pod "4f8a8536-2bb1-4469-b129-7692f01fd739" (UID: "4f8a8536-2bb1-4469-b129-7692f01fd739"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.709442 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v" (OuterVolumeSpecName: "kube-api-access-jpt7v") pod "4f8a8536-2bb1-4469-b129-7692f01fd739" (UID: "4f8a8536-2bb1-4469-b129-7692f01fd739"). InnerVolumeSpecName "kube-api-access-jpt7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.781036 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f8a8536-2bb1-4469-b129-7692f01fd739" (UID: "4f8a8536-2bb1-4469-b129-7692f01fd739"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.804387 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpt7v\" (UniqueName: \"kubernetes.io/projected/4f8a8536-2bb1-4469-b129-7692f01fd739-kube-api-access-jpt7v\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.804444 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:29 crc kubenswrapper[4822]: I1010 08:18:29.804459 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8a8536-2bb1-4469-b129-7692f01fd739-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.039719 4822 generic.go:334] "Generic (PLEG): container finished" podID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerID="9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad" exitCode=0 Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.039769 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9s59c" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.039779 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerDied","Data":"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad"} Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.039837 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9s59c" event={"ID":"4f8a8536-2bb1-4469-b129-7692f01fd739","Type":"ContainerDied","Data":"173b2f2066311f0b7aad14c678eaaab7c0618e394b6abcda64ba92b94d66a4e3"} Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.039858 4822 scope.go:117] "RemoveContainer" containerID="9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.079337 4822 scope.go:117] "RemoveContainer" containerID="5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.098591 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.112477 4822 scope.go:117] "RemoveContainer" containerID="d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.115501 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9s59c"] Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.194822 4822 scope.go:117] "RemoveContainer" containerID="9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad" Oct 10 08:18:30 crc kubenswrapper[4822]: E1010 08:18:30.195365 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad\": container with ID starting with 9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad not found: ID does not exist" containerID="9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.195405 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad"} err="failed to get container status \"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad\": rpc error: code = NotFound desc = could not find container \"9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad\": container with ID starting with 9f785b8e9764646321730c7c93dd2e935b11764d971bce18a21701308f1d31ad not found: ID does not exist" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.195432 4822 scope.go:117] "RemoveContainer" containerID="5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d" Oct 10 08:18:30 crc kubenswrapper[4822]: E1010 08:18:30.196074 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d\": container with ID starting with 5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d not found: ID does not exist" containerID="5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.196129 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d"} err="failed to get container status \"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d\": rpc error: code = NotFound desc = could not find container \"5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d\": container with ID 
starting with 5a75dff3948e9e2f5e400aa9142f323fc117bb71a1f37bbd638593bed7e3fa5d not found: ID does not exist" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.196182 4822 scope.go:117] "RemoveContainer" containerID="d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596" Oct 10 08:18:30 crc kubenswrapper[4822]: E1010 08:18:30.196632 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596\": container with ID starting with d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596 not found: ID does not exist" containerID="d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596" Oct 10 08:18:30 crc kubenswrapper[4822]: I1010 08:18:30.196673 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596"} err="failed to get container status \"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596\": rpc error: code = NotFound desc = could not find container \"d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596\": container with ID starting with d3ef51da96c104555347a0de96c4cd7a8d26892ec3db19bfd77fc8db85fe2596 not found: ID does not exist" Oct 10 08:18:31 crc kubenswrapper[4822]: I1010 08:18:31.336448 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:18:31 crc kubenswrapper[4822]: I1010 08:18:31.336533 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:18:31 crc kubenswrapper[4822]: I1010 08:18:31.671054 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" path="/var/lib/kubelet/pods/4f8a8536-2bb1-4469-b129-7692f01fd739/volumes" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.519391 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:18:51 crc kubenswrapper[4822]: E1010 08:18:51.520548 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="extract-content" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.520566 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="extract-content" Oct 10 08:18:51 crc kubenswrapper[4822]: E1010 08:18:51.520584 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="extract-utilities" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.520595 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="extract-utilities" Oct 10 08:18:51 crc kubenswrapper[4822]: E1010 08:18:51.520605 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.520612 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.520903 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8a8536-2bb1-4469-b129-7692f01fd739" containerName="registry-server" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.523028 4822 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.528531 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.531178 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.531213 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.531269 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdm4s\" (UniqueName: \"kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.632520 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.632573 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.632643 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdm4s\" (UniqueName: \"kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.633257 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.633332 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.663143 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdm4s\" (UniqueName: \"kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s\") pod \"certified-operators-d28w9\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:51 crc kubenswrapper[4822]: I1010 08:18:51.851031 4822 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:18:52 crc kubenswrapper[4822]: I1010 08:18:52.309875 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:18:53 crc kubenswrapper[4822]: I1010 08:18:53.283066 4822 generic.go:334] "Generic (PLEG): container finished" podID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerID="11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3" exitCode=0 Oct 10 08:18:53 crc kubenswrapper[4822]: I1010 08:18:53.283121 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerDied","Data":"11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3"} Oct 10 08:18:53 crc kubenswrapper[4822]: I1010 08:18:53.284575 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerStarted","Data":"675266ddd76bbd65ed13de7c4b5336ceaccfb0b649bc590d6b45dfd1242c5831"} Oct 10 08:18:53 crc kubenswrapper[4822]: I1010 08:18:53.286949 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:18:55 crc kubenswrapper[4822]: I1010 08:18:55.302975 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerStarted","Data":"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b"} Oct 10 08:18:56 crc kubenswrapper[4822]: I1010 08:18:56.314454 4822 generic.go:334] "Generic (PLEG): container finished" podID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerID="ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b" exitCode=0 Oct 10 08:18:56 crc kubenswrapper[4822]: I1010 08:18:56.314509 4822 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerDied","Data":"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b"} Oct 10 08:18:57 crc kubenswrapper[4822]: I1010 08:18:57.327728 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerStarted","Data":"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e"} Oct 10 08:18:57 crc kubenswrapper[4822]: I1010 08:18:57.358912 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d28w9" podStartSLOduration=2.757000349 podStartE2EDuration="6.358872856s" podCreationTimestamp="2025-10-10 08:18:51 +0000 UTC" firstStartedPulling="2025-10-10 08:18:53.286124684 +0000 UTC m=+6880.381282900" lastFinishedPulling="2025-10-10 08:18:56.887997201 +0000 UTC m=+6883.983155407" observedRunningTime="2025-10-10 08:18:57.347605361 +0000 UTC m=+6884.442763577" watchObservedRunningTime="2025-10-10 08:18:57.358872856 +0000 UTC m=+6884.454031062" Oct 10 08:19:01 crc kubenswrapper[4822]: I1010 08:19:01.337232 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:19:01 crc kubenswrapper[4822]: I1010 08:19:01.341423 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:19:01 crc kubenswrapper[4822]: 
I1010 08:19:01.852226 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:01 crc kubenswrapper[4822]: I1010 08:19:01.852291 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:01 crc kubenswrapper[4822]: I1010 08:19:01.917448 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:02 crc kubenswrapper[4822]: I1010 08:19:02.427621 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:02 crc kubenswrapper[4822]: I1010 08:19:02.482182 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:19:04 crc kubenswrapper[4822]: I1010 08:19:04.397160 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d28w9" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="registry-server" containerID="cri-o://0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e" gracePeriod=2 Oct 10 08:19:04 crc kubenswrapper[4822]: I1010 08:19:04.901514 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.040327 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities\") pod \"6aadd01d-7cbc-40b9-9659-f96954f73aed\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.040512 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdm4s\" (UniqueName: \"kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s\") pod \"6aadd01d-7cbc-40b9-9659-f96954f73aed\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.040602 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content\") pod \"6aadd01d-7cbc-40b9-9659-f96954f73aed\" (UID: \"6aadd01d-7cbc-40b9-9659-f96954f73aed\") " Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.042007 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities" (OuterVolumeSpecName: "utilities") pod "6aadd01d-7cbc-40b9-9659-f96954f73aed" (UID: "6aadd01d-7cbc-40b9-9659-f96954f73aed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.048504 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s" (OuterVolumeSpecName: "kube-api-access-gdm4s") pod "6aadd01d-7cbc-40b9-9659-f96954f73aed" (UID: "6aadd01d-7cbc-40b9-9659-f96954f73aed"). InnerVolumeSpecName "kube-api-access-gdm4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.133266 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6aadd01d-7cbc-40b9-9659-f96954f73aed" (UID: "6aadd01d-7cbc-40b9-9659-f96954f73aed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.143134 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.143159 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdm4s\" (UniqueName: \"kubernetes.io/projected/6aadd01d-7cbc-40b9-9659-f96954f73aed-kube-api-access-gdm4s\") on node \"crc\" DevicePath \"\"" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.143170 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aadd01d-7cbc-40b9-9659-f96954f73aed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.408659 4822 generic.go:334] "Generic (PLEG): container finished" podID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerID="0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e" exitCode=0 Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.408700 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerDied","Data":"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e"} Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.408730 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-d28w9" event={"ID":"6aadd01d-7cbc-40b9-9659-f96954f73aed","Type":"ContainerDied","Data":"675266ddd76bbd65ed13de7c4b5336ceaccfb0b649bc590d6b45dfd1242c5831"} Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.408746 4822 scope.go:117] "RemoveContainer" containerID="0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.408890 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d28w9" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.436394 4822 scope.go:117] "RemoveContainer" containerID="ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.447860 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.457462 4822 scope.go:117] "RemoveContainer" containerID="11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.459828 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d28w9"] Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.508839 4822 scope.go:117] "RemoveContainer" containerID="0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e" Oct 10 08:19:05 crc kubenswrapper[4822]: E1010 08:19:05.509387 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e\": container with ID starting with 0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e not found: ID does not exist" containerID="0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 
08:19:05.509432 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e"} err="failed to get container status \"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e\": rpc error: code = NotFound desc = could not find container \"0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e\": container with ID starting with 0153e334c647e797b6424e304f54f8f7b77bae5a418a89e8e3eb493645d5561e not found: ID does not exist" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.509483 4822 scope.go:117] "RemoveContainer" containerID="ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b" Oct 10 08:19:05 crc kubenswrapper[4822]: E1010 08:19:05.510089 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b\": container with ID starting with ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b not found: ID does not exist" containerID="ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.510135 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b"} err="failed to get container status \"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b\": rpc error: code = NotFound desc = could not find container \"ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b\": container with ID starting with ea189f8a3f048f4e311bb38c33decfe7ab3a9eb5681df3e808fcfdf4dc529a0b not found: ID does not exist" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.510162 4822 scope.go:117] "RemoveContainer" containerID="11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3" Oct 10 08:19:05 crc 
kubenswrapper[4822]: E1010 08:19:05.510544 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3\": container with ID starting with 11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3 not found: ID does not exist" containerID="11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.510674 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3"} err="failed to get container status \"11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3\": rpc error: code = NotFound desc = could not find container \"11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3\": container with ID starting with 11df1b88d02663a4a500de20e20626d5c569e9ad56b92f55e36d9ff097f63fb3 not found: ID does not exist" Oct 10 08:19:05 crc kubenswrapper[4822]: I1010 08:19:05.682333 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" path="/var/lib/kubelet/pods/6aadd01d-7cbc-40b9-9659-f96954f73aed/volumes" Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.336638 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.337218 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.337267 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.338107 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.338177 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c" gracePeriod=600 Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.719108 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c" exitCode=0 Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.719203 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c"} Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.719474 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293"} Oct 10 08:19:31 crc kubenswrapper[4822]: I1010 08:19:31.719499 4822 scope.go:117] "RemoveContainer" containerID="6c16645eb352cb90c24756180043ef347329ea739f11b2b2ac92d2408661f9ab" Oct 10 08:19:33 crc kubenswrapper[4822]: I1010 08:19:33.047895 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-mbhwj"] Oct 10 08:19:33 crc kubenswrapper[4822]: I1010 08:19:33.063215 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-mbhwj"] Oct 10 08:19:33 crc kubenswrapper[4822]: I1010 08:19:33.661875 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93d3b39-e873-4822-98ab-c02d72ffc7a1" path="/var/lib/kubelet/pods/b93d3b39-e873-4822-98ab-c02d72ffc7a1/volumes" Oct 10 08:19:37 crc kubenswrapper[4822]: I1010 08:19:37.861008 4822 scope.go:117] "RemoveContainer" containerID="0fd944e0d90dd2d6c89f9ef2a20c073169a61b490982190dcc56ca2c9a20f739" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.390417 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:19:41 crc kubenswrapper[4822]: E1010 08:19:41.392081 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="extract-utilities" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.392102 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="extract-utilities" Oct 10 08:19:41 crc kubenswrapper[4822]: E1010 08:19:41.392112 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="extract-content" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.392119 4822 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="extract-content" Oct 10 08:19:41 crc kubenswrapper[4822]: E1010 08:19:41.392147 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="registry-server" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.392155 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="registry-server" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.392454 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aadd01d-7cbc-40b9-9659-f96954f73aed" containerName="registry-server" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.394753 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.407050 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.525312 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.525398 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vbd\" (UniqueName: \"kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.525589 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.627596 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.628015 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.628078 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vbd\" (UniqueName: \"kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.628203 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.628417 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.648909 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vbd\" (UniqueName: \"kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd\") pod \"community-operators-5t56m\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:41 crc kubenswrapper[4822]: I1010 08:19:41.739187 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:42 crc kubenswrapper[4822]: I1010 08:19:42.297051 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:19:42 crc kubenswrapper[4822]: I1010 08:19:42.836290 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerID="3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9" exitCode=0 Oct 10 08:19:42 crc kubenswrapper[4822]: I1010 08:19:42.836399 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerDied","Data":"3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9"} Oct 10 08:19:42 crc kubenswrapper[4822]: I1010 08:19:42.836590 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerStarted","Data":"eaab8855fd590fa48501bf9b6ae269f24790818885e5ad7c2e033813c1000526"} Oct 10 08:19:43 crc kubenswrapper[4822]: I1010 
08:19:43.860357 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerStarted","Data":"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217"} Oct 10 08:19:44 crc kubenswrapper[4822]: I1010 08:19:44.074076 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-520f-account-create-lmc9m"] Oct 10 08:19:44 crc kubenswrapper[4822]: I1010 08:19:44.088447 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-520f-account-create-lmc9m"] Oct 10 08:19:45 crc kubenswrapper[4822]: I1010 08:19:45.691551 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30dfa431-47a6-485e-b4df-db5ff05df8e4" path="/var/lib/kubelet/pods/30dfa431-47a6-485e-b4df-db5ff05df8e4/volumes" Oct 10 08:19:45 crc kubenswrapper[4822]: I1010 08:19:45.883659 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerID="5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217" exitCode=0 Oct 10 08:19:45 crc kubenswrapper[4822]: I1010 08:19:45.883707 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerDied","Data":"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217"} Oct 10 08:19:46 crc kubenswrapper[4822]: I1010 08:19:46.895673 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerStarted","Data":"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40"} Oct 10 08:19:46 crc kubenswrapper[4822]: I1010 08:19:46.924302 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t56m" podStartSLOduration=2.434973608 
podStartE2EDuration="5.92428241s" podCreationTimestamp="2025-10-10 08:19:41 +0000 UTC" firstStartedPulling="2025-10-10 08:19:42.837836554 +0000 UTC m=+6929.932994750" lastFinishedPulling="2025-10-10 08:19:46.327145356 +0000 UTC m=+6933.422303552" observedRunningTime="2025-10-10 08:19:46.921715466 +0000 UTC m=+6934.016873672" watchObservedRunningTime="2025-10-10 08:19:46.92428241 +0000 UTC m=+6934.019440596" Oct 10 08:19:51 crc kubenswrapper[4822]: I1010 08:19:51.739369 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:51 crc kubenswrapper[4822]: I1010 08:19:51.740021 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:19:52 crc kubenswrapper[4822]: I1010 08:19:52.810782 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5t56m" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="registry-server" probeResult="failure" output=< Oct 10 08:19:52 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:19:52 crc kubenswrapper[4822]: > Oct 10 08:19:55 crc kubenswrapper[4822]: I1010 08:19:55.033134 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-zvqf5"] Oct 10 08:19:55 crc kubenswrapper[4822]: I1010 08:19:55.040430 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-zvqf5"] Oct 10 08:19:55 crc kubenswrapper[4822]: I1010 08:19:55.666060 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4a48ef-7c7b-460b-b968-8bdb160b3fa0" path="/var/lib/kubelet/pods/1c4a48ef-7c7b-460b-b968-8bdb160b3fa0/volumes" Oct 10 08:20:01 crc kubenswrapper[4822]: I1010 08:20:01.790152 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:20:01 crc kubenswrapper[4822]: 
I1010 08:20:01.842563 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:20:02 crc kubenswrapper[4822]: I1010 08:20:02.030312 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.058016 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t56m" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="registry-server" containerID="cri-o://620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40" gracePeriod=2 Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.588265 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.725700 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4vbd\" (UniqueName: \"kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd\") pod \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.725889 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities\") pod \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.725937 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content\") pod \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\" (UID: \"b5c6b0db-f11f-469f-bbf6-e2c108e38508\") " Oct 10 08:20:03 crc 
kubenswrapper[4822]: I1010 08:20:03.727089 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities" (OuterVolumeSpecName: "utilities") pod "b5c6b0db-f11f-469f-bbf6-e2c108e38508" (UID: "b5c6b0db-f11f-469f-bbf6-e2c108e38508"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.728182 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.733356 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd" (OuterVolumeSpecName: "kube-api-access-f4vbd") pod "b5c6b0db-f11f-469f-bbf6-e2c108e38508" (UID: "b5c6b0db-f11f-469f-bbf6-e2c108e38508"). InnerVolumeSpecName "kube-api-access-f4vbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.787838 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5c6b0db-f11f-469f-bbf6-e2c108e38508" (UID: "b5c6b0db-f11f-469f-bbf6-e2c108e38508"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.830949 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c6b0db-f11f-469f-bbf6-e2c108e38508-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:03 crc kubenswrapper[4822]: I1010 08:20:03.830995 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4vbd\" (UniqueName: \"kubernetes.io/projected/b5c6b0db-f11f-469f-bbf6-e2c108e38508-kube-api-access-f4vbd\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.074563 4822 generic.go:334] "Generic (PLEG): container finished" podID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerID="620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40" exitCode=0 Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.074667 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerDied","Data":"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40"} Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.074688 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t56m" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.074968 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t56m" event={"ID":"b5c6b0db-f11f-469f-bbf6-e2c108e38508","Type":"ContainerDied","Data":"eaab8855fd590fa48501bf9b6ae269f24790818885e5ad7c2e033813c1000526"} Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.074985 4822 scope.go:117] "RemoveContainer" containerID="620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.118175 4822 scope.go:117] "RemoveContainer" containerID="5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.142940 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.160484 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5t56m"] Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.161232 4822 scope.go:117] "RemoveContainer" containerID="3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.230577 4822 scope.go:117] "RemoveContainer" containerID="620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40" Oct 10 08:20:04 crc kubenswrapper[4822]: E1010 08:20:04.231075 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40\": container with ID starting with 620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40 not found: ID does not exist" containerID="620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.231106 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40"} err="failed to get container status \"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40\": rpc error: code = NotFound desc = could not find container \"620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40\": container with ID starting with 620c57373b3d9ca7ae17c5687e2bd084b56e0e8eeda8fec8b402c8976cfa2f40 not found: ID does not exist" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.231129 4822 scope.go:117] "RemoveContainer" containerID="5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217" Oct 10 08:20:04 crc kubenswrapper[4822]: E1010 08:20:04.231637 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217\": container with ID starting with 5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217 not found: ID does not exist" containerID="5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.231660 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217"} err="failed to get container status \"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217\": rpc error: code = NotFound desc = could not find container \"5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217\": container with ID starting with 5961f6848c39ca9d8f653dc18d3f5797dbe57da017353154660d5e2793fc7217 not found: ID does not exist" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.231672 4822 scope.go:117] "RemoveContainer" containerID="3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9" Oct 10 08:20:04 crc kubenswrapper[4822]: E1010 
08:20:04.231920 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9\": container with ID starting with 3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9 not found: ID does not exist" containerID="3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9" Oct 10 08:20:04 crc kubenswrapper[4822]: I1010 08:20:04.231980 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9"} err="failed to get container status \"3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9\": rpc error: code = NotFound desc = could not find container \"3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9\": container with ID starting with 3cc38a510bed30bf646bddb3977e422823fdf1ac370d40b72213e2617732bee9 not found: ID does not exist" Oct 10 08:20:05 crc kubenswrapper[4822]: I1010 08:20:05.663387 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" path="/var/lib/kubelet/pods/b5c6b0db-f11f-469f-bbf6-e2c108e38508/volumes" Oct 10 08:20:24 crc kubenswrapper[4822]: I1010 08:20:24.050207 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-9fn4l"] Oct 10 08:20:24 crc kubenswrapper[4822]: I1010 08:20:24.068747 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-9fn4l"] Oct 10 08:20:25 crc kubenswrapper[4822]: I1010 08:20:25.669908 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130de137-8831-4c2b-9188-aea0f9950b9e" path="/var/lib/kubelet/pods/130de137-8831-4c2b-9188-aea0f9950b9e/volumes" Oct 10 08:20:34 crc kubenswrapper[4822]: I1010 08:20:34.048567 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-730f-account-create-2llzq"] Oct 
10 08:20:34 crc kubenswrapper[4822]: I1010 08:20:34.057965 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-730f-account-create-2llzq"] Oct 10 08:20:35 crc kubenswrapper[4822]: I1010 08:20:35.673717 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada20cfb-1fd7-4710-bcd3-f105fed432fa" path="/var/lib/kubelet/pods/ada20cfb-1fd7-4710-bcd3-f105fed432fa/volumes" Oct 10 08:20:37 crc kubenswrapper[4822]: I1010 08:20:37.955358 4822 scope.go:117] "RemoveContainer" containerID="759e6df25a0e08c62e8ce28baa854098a8c3859f8a036b91e2380376fc92de24" Oct 10 08:20:37 crc kubenswrapper[4822]: I1010 08:20:37.997037 4822 scope.go:117] "RemoveContainer" containerID="fa194e3858649ca2a048f6f1e4669480bc87e2b381ea8f2927dbbd133b4a7fe4" Oct 10 08:20:38 crc kubenswrapper[4822]: I1010 08:20:38.067986 4822 scope.go:117] "RemoveContainer" containerID="7c51ae10045f0c1a227389a6451799108c35c721779dce89c46b50bef8a7a4d7" Oct 10 08:20:38 crc kubenswrapper[4822]: I1010 08:20:38.124694 4822 scope.go:117] "RemoveContainer" containerID="8385f783def928f15d69fcc41b02fda83ad97d1a3db75bba1e05365a9f30141e" Oct 10 08:20:45 crc kubenswrapper[4822]: I1010 08:20:45.041017 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-6jcn9"] Oct 10 08:20:45 crc kubenswrapper[4822]: I1010 08:20:45.056139 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-6jcn9"] Oct 10 08:20:45 crc kubenswrapper[4822]: I1010 08:20:45.665475 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40b9ac3-db63-4902-8b68-ba81adf704f9" path="/var/lib/kubelet/pods/c40b9ac3-db63-4902-8b68-ba81adf704f9/volumes" Oct 10 08:21:31 crc kubenswrapper[4822]: I1010 08:21:31.337377 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:21:31 crc kubenswrapper[4822]: I1010 08:21:31.338113 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:21:38 crc kubenswrapper[4822]: I1010 08:21:38.277672 4822 scope.go:117] "RemoveContainer" containerID="a7455e852761aac14f1f9b4a302a731e4c0c6848ca81da2758cefa26b7677809" Oct 10 08:22:01 crc kubenswrapper[4822]: I1010 08:22:01.336324 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:22:01 crc kubenswrapper[4822]: I1010 08:22:01.336879 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.336905 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.337704 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.337798 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.338939 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.339037 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" gracePeriod=600 Oct 10 08:22:31 crc kubenswrapper[4822]: E1010 08:22:31.468112 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.663629 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" exitCode=0 Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 
08:22:31.666420 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293"} Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.666563 4822 scope.go:117] "RemoveContainer" containerID="64afd18a7b7078060668666b54c7fb80d0d988c3b4f5d0d8e895230c1850c47c" Oct 10 08:22:31 crc kubenswrapper[4822]: I1010 08:22:31.667396 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:22:31 crc kubenswrapper[4822]: E1010 08:22:31.667840 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:22:46 crc kubenswrapper[4822]: I1010 08:22:46.650046 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:22:46 crc kubenswrapper[4822]: E1010 08:22:46.650789 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:22:58 crc kubenswrapper[4822]: I1010 08:22:58.651413 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 
08:22:58 crc kubenswrapper[4822]: E1010 08:22:58.652480 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:23:10 crc kubenswrapper[4822]: I1010 08:23:10.651107 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:23:10 crc kubenswrapper[4822]: E1010 08:23:10.652542 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:23:25 crc kubenswrapper[4822]: I1010 08:23:25.650915 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:23:25 crc kubenswrapper[4822]: E1010 08:23:25.652072 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:23:39 crc kubenswrapper[4822]: I1010 08:23:39.650932 4822 scope.go:117] "RemoveContainer" 
containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:23:39 crc kubenswrapper[4822]: E1010 08:23:39.652334 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:23:44 crc kubenswrapper[4822]: I1010 08:23:44.404993 4822 generic.go:334] "Generic (PLEG): container finished" podID="cb5a26a4-8ae3-4b96-a905-70be164e9198" containerID="93995f98556f396cdf49bdf8ef00217a74ebcc70e5a36c7ebc867cc73d6635bb" exitCode=0 Oct 10 08:23:44 crc kubenswrapper[4822]: I1010 08:23:44.405088 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" event={"ID":"cb5a26a4-8ae3-4b96-a905-70be164e9198","Type":"ContainerDied","Data":"93995f98556f396cdf49bdf8ef00217a74ebcc70e5a36c7ebc867cc73d6635bb"} Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.852573 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.891440 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdzvr\" (UniqueName: \"kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr\") pod \"cb5a26a4-8ae3-4b96-a905-70be164e9198\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.891491 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle\") pod \"cb5a26a4-8ae3-4b96-a905-70be164e9198\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.891530 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory\") pod \"cb5a26a4-8ae3-4b96-a905-70be164e9198\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.891563 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph\") pod \"cb5a26a4-8ae3-4b96-a905-70be164e9198\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.891714 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key\") pod \"cb5a26a4-8ae3-4b96-a905-70be164e9198\" (UID: \"cb5a26a4-8ae3-4b96-a905-70be164e9198\") " Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.899792 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "cb5a26a4-8ae3-4b96-a905-70be164e9198" (UID: "cb5a26a4-8ae3-4b96-a905-70be164e9198"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.900498 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr" (OuterVolumeSpecName: "kube-api-access-cdzvr") pod "cb5a26a4-8ae3-4b96-a905-70be164e9198" (UID: "cb5a26a4-8ae3-4b96-a905-70be164e9198"). InnerVolumeSpecName "kube-api-access-cdzvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.901376 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph" (OuterVolumeSpecName: "ceph") pod "cb5a26a4-8ae3-4b96-a905-70be164e9198" (UID: "cb5a26a4-8ae3-4b96-a905-70be164e9198"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.927915 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cb5a26a4-8ae3-4b96-a905-70be164e9198" (UID: "cb5a26a4-8ae3-4b96-a905-70be164e9198"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.933019 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory" (OuterVolumeSpecName: "inventory") pod "cb5a26a4-8ae3-4b96-a905-70be164e9198" (UID: "cb5a26a4-8ae3-4b96-a905-70be164e9198"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.995293 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdzvr\" (UniqueName: \"kubernetes.io/projected/cb5a26a4-8ae3-4b96-a905-70be164e9198-kube-api-access-cdzvr\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.995339 4822 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.995352 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.995361 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:45 crc kubenswrapper[4822]: I1010 08:23:45.995372 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb5a26a4-8ae3-4b96-a905-70be164e9198-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:46 crc kubenswrapper[4822]: I1010 08:23:46.428637 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" event={"ID":"cb5a26a4-8ae3-4b96-a905-70be164e9198","Type":"ContainerDied","Data":"e7b8d53e203b0fddbbfd7ae8dd01fc60c94c040f1a1ce9772faabe58e5a5360f"} Oct 10 08:23:46 crc kubenswrapper[4822]: I1010 08:23:46.429040 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b8d53e203b0fddbbfd7ae8dd01fc60c94c040f1a1ce9772faabe58e5a5360f" Oct 10 08:23:46 crc 
kubenswrapper[4822]: I1010 08:23:46.428839 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.531236 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zxpzn"] Oct 10 08:23:50 crc kubenswrapper[4822]: E1010 08:23:50.533047 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="registry-server" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533085 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="registry-server" Oct 10 08:23:50 crc kubenswrapper[4822]: E1010 08:23:50.533114 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="extract-content" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533133 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="extract-content" Oct 10 08:23:50 crc kubenswrapper[4822]: E1010 08:23:50.533184 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5a26a4-8ae3-4b96-a905-70be164e9198" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533207 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5a26a4-8ae3-4b96-a905-70be164e9198" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:23:50 crc kubenswrapper[4822]: E1010 08:23:50.533259 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="extract-utilities" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533277 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" 
containerName="extract-utilities" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533867 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c6b0db-f11f-469f-bbf6-e2c108e38508" containerName="registry-server" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.533921 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5a26a4-8ae3-4b96-a905-70be164e9198" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.535760 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.539651 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.539681 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.539774 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.541448 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.541927 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zxpzn"] Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.715238 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc 
kubenswrapper[4822]: I1010 08:23:50.715596 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.716219 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmbm2\" (UniqueName: \"kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.716312 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.716485 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.818773 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmbm2\" (UniqueName: \"kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2\") pod 
\"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.818847 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.818923 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.818993 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.819042 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.825759 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.826147 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.826991 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.827126 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.844171 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmbm2\" (UniqueName: \"kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2\") pod \"bootstrap-openstack-openstack-cell1-zxpzn\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:50 crc kubenswrapper[4822]: I1010 08:23:50.867126 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:23:51 crc kubenswrapper[4822]: I1010 08:23:51.420634 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zxpzn"] Oct 10 08:23:51 crc kubenswrapper[4822]: I1010 08:23:51.489550 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" event={"ID":"35c21f6c-c1c3-4191-b721-ac63a25495ab","Type":"ContainerStarted","Data":"876b9587b35ac1925f45112ec8c85efb9ff54f0557b27a911912d9a5d6305333"} Oct 10 08:23:52 crc kubenswrapper[4822]: I1010 08:23:52.500253 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" event={"ID":"35c21f6c-c1c3-4191-b721-ac63a25495ab","Type":"ContainerStarted","Data":"5cceeab8be930193d12ff8fb744ae6c1e993ebf5137eea9c3bfeb3e89b10299b"} Oct 10 08:23:52 crc kubenswrapper[4822]: I1010 08:23:52.524633 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" podStartSLOduration=2.127893727 podStartE2EDuration="2.524613518s" podCreationTimestamp="2025-10-10 08:23:50 +0000 UTC" firstStartedPulling="2025-10-10 08:23:51.42707846 +0000 UTC m=+7178.522236696" lastFinishedPulling="2025-10-10 08:23:51.823798271 +0000 UTC m=+7178.918956487" observedRunningTime="2025-10-10 08:23:52.52016193 +0000 UTC m=+7179.615320166" watchObservedRunningTime="2025-10-10 08:23:52.524613518 +0000 UTC m=+7179.619771724" Oct 10 08:23:52 crc kubenswrapper[4822]: I1010 08:23:52.651608 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:23:52 crc kubenswrapper[4822]: E1010 08:23:52.651954 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:05 crc kubenswrapper[4822]: I1010 08:24:05.654695 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:24:05 crc kubenswrapper[4822]: E1010 08:24:05.656100 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:17 crc kubenswrapper[4822]: I1010 08:24:17.650254 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:24:17 crc kubenswrapper[4822]: E1010 08:24:17.650995 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:29 crc kubenswrapper[4822]: I1010 08:24:29.650562 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:24:29 crc kubenswrapper[4822]: E1010 08:24:29.651864 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.794161 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.797931 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.804236 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.840713 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4sh\" (UniqueName: \"kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.840829 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.841005 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " 
pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.943429 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.943982 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4sh\" (UniqueName: \"kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.944139 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.944169 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.944650 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" 
Oct 10 08:24:41 crc kubenswrapper[4822]: I1010 08:24:41.966064 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4sh\" (UniqueName: \"kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh\") pod \"redhat-marketplace-zdx75\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:42 crc kubenswrapper[4822]: I1010 08:24:42.136835 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:42 crc kubenswrapper[4822]: I1010 08:24:42.620133 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:42 crc kubenswrapper[4822]: I1010 08:24:42.653134 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:24:42 crc kubenswrapper[4822]: E1010 08:24:42.653949 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:43 crc kubenswrapper[4822]: I1010 08:24:43.093839 4822 generic.go:334] "Generic (PLEG): container finished" podID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerID="29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a" exitCode=0 Oct 10 08:24:43 crc kubenswrapper[4822]: I1010 08:24:43.093961 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" 
event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerDied","Data":"29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a"} Oct 10 08:24:43 crc kubenswrapper[4822]: I1010 08:24:43.094256 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerStarted","Data":"3173a2aff6874e084fbdb480ee06b0c05ba5c32d4ac2baa3b93f6d08354890b9"} Oct 10 08:24:43 crc kubenswrapper[4822]: I1010 08:24:43.097232 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:24:44 crc kubenswrapper[4822]: I1010 08:24:44.106127 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerStarted","Data":"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a"} Oct 10 08:24:45 crc kubenswrapper[4822]: I1010 08:24:45.118501 4822 generic.go:334] "Generic (PLEG): container finished" podID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerID="3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a" exitCode=0 Oct 10 08:24:45 crc kubenswrapper[4822]: I1010 08:24:45.118571 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerDied","Data":"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a"} Oct 10 08:24:46 crc kubenswrapper[4822]: I1010 08:24:46.132708 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerStarted","Data":"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66"} Oct 10 08:24:46 crc kubenswrapper[4822]: I1010 08:24:46.154993 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-zdx75" podStartSLOduration=2.717245838 podStartE2EDuration="5.154977412s" podCreationTimestamp="2025-10-10 08:24:41 +0000 UTC" firstStartedPulling="2025-10-10 08:24:43.096913786 +0000 UTC m=+7230.192071992" lastFinishedPulling="2025-10-10 08:24:45.53464536 +0000 UTC m=+7232.629803566" observedRunningTime="2025-10-10 08:24:46.151611595 +0000 UTC m=+7233.246769821" watchObservedRunningTime="2025-10-10 08:24:46.154977412 +0000 UTC m=+7233.250135608" Oct 10 08:24:52 crc kubenswrapper[4822]: I1010 08:24:52.138239 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:52 crc kubenswrapper[4822]: I1010 08:24:52.138767 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:52 crc kubenswrapper[4822]: I1010 08:24:52.226364 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:52 crc kubenswrapper[4822]: I1010 08:24:52.290275 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:52 crc kubenswrapper[4822]: I1010 08:24:52.473006 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:53 crc kubenswrapper[4822]: I1010 08:24:53.659571 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:24:53 crc kubenswrapper[4822]: E1010 08:24:53.659874 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.226642 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zdx75" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="registry-server" containerID="cri-o://f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66" gracePeriod=2 Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.700656 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.798298 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities\") pod \"defe3979-e91f-4e98-86bb-1d34a7fa192d\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.798825 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content\") pod \"defe3979-e91f-4e98-86bb-1d34a7fa192d\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.798907 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4sh\" (UniqueName: \"kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh\") pod \"defe3979-e91f-4e98-86bb-1d34a7fa192d\" (UID: \"defe3979-e91f-4e98-86bb-1d34a7fa192d\") " Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.799438 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities" (OuterVolumeSpecName: "utilities") 
pod "defe3979-e91f-4e98-86bb-1d34a7fa192d" (UID: "defe3979-e91f-4e98-86bb-1d34a7fa192d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.807123 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh" (OuterVolumeSpecName: "kube-api-access-rg4sh") pod "defe3979-e91f-4e98-86bb-1d34a7fa192d" (UID: "defe3979-e91f-4e98-86bb-1d34a7fa192d"). InnerVolumeSpecName "kube-api-access-rg4sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.811095 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "defe3979-e91f-4e98-86bb-1d34a7fa192d" (UID: "defe3979-e91f-4e98-86bb-1d34a7fa192d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.901239 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.901278 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defe3979-e91f-4e98-86bb-1d34a7fa192d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:54 crc kubenswrapper[4822]: I1010 08:24:54.901292 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4sh\" (UniqueName: \"kubernetes.io/projected/defe3979-e91f-4e98-86bb-1d34a7fa192d-kube-api-access-rg4sh\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.239271 4822 generic.go:334] "Generic (PLEG): container finished" podID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerID="f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66" exitCode=0 Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.239315 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerDied","Data":"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66"} Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.239354 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdx75" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.239384 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdx75" event={"ID":"defe3979-e91f-4e98-86bb-1d34a7fa192d","Type":"ContainerDied","Data":"3173a2aff6874e084fbdb480ee06b0c05ba5c32d4ac2baa3b93f6d08354890b9"} Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.239408 4822 scope.go:117] "RemoveContainer" containerID="f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.272624 4822 scope.go:117] "RemoveContainer" containerID="3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.287204 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.298730 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdx75"] Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.309091 4822 scope.go:117] "RemoveContainer" containerID="29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.358324 4822 scope.go:117] "RemoveContainer" containerID="f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66" Oct 10 08:24:55 crc kubenswrapper[4822]: E1010 08:24:55.359110 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66\": container with ID starting with f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66 not found: ID does not exist" containerID="f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.359148 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66"} err="failed to get container status \"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66\": rpc error: code = NotFound desc = could not find container \"f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66\": container with ID starting with f0f6e8c1245594da2aaed79d45b1a2f51d92f69583be23cc7dd0fc0111f30a66 not found: ID does not exist" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.359176 4822 scope.go:117] "RemoveContainer" containerID="3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a" Oct 10 08:24:55 crc kubenswrapper[4822]: E1010 08:24:55.359685 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a\": container with ID starting with 3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a not found: ID does not exist" containerID="3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.359729 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a"} err="failed to get container status \"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a\": rpc error: code = NotFound desc = could not find container \"3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a\": container with ID starting with 3bdb8ee350848cace907df416cbba754053e28603a4c61f040a803f8bb21551a not found: ID does not exist" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.359762 4822 scope.go:117] "RemoveContainer" containerID="29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a" Oct 10 08:24:55 crc kubenswrapper[4822]: E1010 
08:24:55.360141 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a\": container with ID starting with 29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a not found: ID does not exist" containerID="29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.360178 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a"} err="failed to get container status \"29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a\": rpc error: code = NotFound desc = could not find container \"29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a\": container with ID starting with 29c0ccc3ea6ca6a20801b2cc725d496540f9b25aa94fdc45dbadf8ed325e804a not found: ID does not exist" Oct 10 08:24:55 crc kubenswrapper[4822]: I1010 08:24:55.665381 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" path="/var/lib/kubelet/pods/defe3979-e91f-4e98-86bb-1d34a7fa192d/volumes" Oct 10 08:25:05 crc kubenswrapper[4822]: I1010 08:25:05.650741 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:25:05 crc kubenswrapper[4822]: E1010 08:25:05.651645 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:25:20 crc kubenswrapper[4822]: I1010 08:25:20.651171 
4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:25:20 crc kubenswrapper[4822]: E1010 08:25:20.652106 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:25:33 crc kubenswrapper[4822]: I1010 08:25:33.659864 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:25:33 crc kubenswrapper[4822]: E1010 08:25:33.662157 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:25:46 crc kubenswrapper[4822]: I1010 08:25:46.651631 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:25:46 crc kubenswrapper[4822]: E1010 08:25:46.653940 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:26:01 crc kubenswrapper[4822]: I1010 
08:26:01.650560 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:26:01 crc kubenswrapper[4822]: E1010 08:26:01.651390 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:26:13 crc kubenswrapper[4822]: I1010 08:26:13.661722 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:26:13 crc kubenswrapper[4822]: E1010 08:26:13.665674 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:26:24 crc kubenswrapper[4822]: I1010 08:26:24.651013 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:26:24 crc kubenswrapper[4822]: E1010 08:26:24.651994 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:26:39 crc 
kubenswrapper[4822]: I1010 08:26:39.651300 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:26:39 crc kubenswrapper[4822]: E1010 08:26:39.652514 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:26:51 crc kubenswrapper[4822]: I1010 08:26:51.652548 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:26:51 crc kubenswrapper[4822]: E1010 08:26:51.653607 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:27:02 crc kubenswrapper[4822]: I1010 08:27:02.514742 4822 generic.go:334] "Generic (PLEG): container finished" podID="35c21f6c-c1c3-4191-b721-ac63a25495ab" containerID="5cceeab8be930193d12ff8fb744ae6c1e993ebf5137eea9c3bfeb3e89b10299b" exitCode=0 Oct 10 08:27:02 crc kubenswrapper[4822]: I1010 08:27:02.514881 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" event={"ID":"35c21f6c-c1c3-4191-b721-ac63a25495ab","Type":"ContainerDied","Data":"5cceeab8be930193d12ff8fb744ae6c1e993ebf5137eea9c3bfeb3e89b10299b"} Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.052485 4822 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.126104 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle\") pod \"35c21f6c-c1c3-4191-b721-ac63a25495ab\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.126215 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmbm2\" (UniqueName: \"kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2\") pod \"35c21f6c-c1c3-4191-b721-ac63a25495ab\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.126277 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory\") pod \"35c21f6c-c1c3-4191-b721-ac63a25495ab\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.126332 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key\") pod \"35c21f6c-c1c3-4191-b721-ac63a25495ab\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.126437 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph\") pod \"35c21f6c-c1c3-4191-b721-ac63a25495ab\" (UID: \"35c21f6c-c1c3-4191-b721-ac63a25495ab\") " Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.132345 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "35c21f6c-c1c3-4191-b721-ac63a25495ab" (UID: "35c21f6c-c1c3-4191-b721-ac63a25495ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.133124 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2" (OuterVolumeSpecName: "kube-api-access-qmbm2") pod "35c21f6c-c1c3-4191-b721-ac63a25495ab" (UID: "35c21f6c-c1c3-4191-b721-ac63a25495ab"). InnerVolumeSpecName "kube-api-access-qmbm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.140158 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph" (OuterVolumeSpecName: "ceph") pod "35c21f6c-c1c3-4191-b721-ac63a25495ab" (UID: "35c21f6c-c1c3-4191-b721-ac63a25495ab"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.159339 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35c21f6c-c1c3-4191-b721-ac63a25495ab" (UID: "35c21f6c-c1c3-4191-b721-ac63a25495ab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.170489 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory" (OuterVolumeSpecName: "inventory") pod "35c21f6c-c1c3-4191-b721-ac63a25495ab" (UID: "35c21f6c-c1c3-4191-b721-ac63a25495ab"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.229600 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.229653 4822 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.229672 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmbm2\" (UniqueName: \"kubernetes.io/projected/35c21f6c-c1c3-4191-b721-ac63a25495ab-kube-api-access-qmbm2\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.229683 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.229695 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35c21f6c-c1c3-4191-b721-ac63a25495ab-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.551497 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" event={"ID":"35c21f6c-c1c3-4191-b721-ac63a25495ab","Type":"ContainerDied","Data":"876b9587b35ac1925f45112ec8c85efb9ff54f0557b27a911912d9a5d6305333"} Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.551865 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876b9587b35ac1925f45112ec8c85efb9ff54f0557b27a911912d9a5d6305333" Oct 10 08:27:04 crc kubenswrapper[4822]: 
I1010 08:27:04.551735 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zxpzn" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.627244 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-rsv2k"] Oct 10 08:27:04 crc kubenswrapper[4822]: E1010 08:27:04.628097 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="registry-server" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.628133 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="registry-server" Oct 10 08:27:04 crc kubenswrapper[4822]: E1010 08:27:04.628160 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="extract-utilities" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.628172 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="extract-utilities" Oct 10 08:27:04 crc kubenswrapper[4822]: E1010 08:27:04.628210 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="extract-content" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.628222 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="extract-content" Oct 10 08:27:04 crc kubenswrapper[4822]: E1010 08:27:04.628257 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c21f6c-c1c3-4191-b721-ac63a25495ab" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.628270 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c21f6c-c1c3-4191-b721-ac63a25495ab" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:27:04 crc 
kubenswrapper[4822]: I1010 08:27:04.628642 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c21f6c-c1c3-4191-b721-ac63a25495ab" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.628682 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="defe3979-e91f-4e98-86bb-1d34a7fa192d" containerName="registry-server" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.629911 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.634782 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.636930 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.637098 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.637520 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.684583 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-rsv2k"] Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.737966 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.738018 4822 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54c5\" (UniqueName: \"kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.738050 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.738132 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.839736 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.839831 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n54c5\" (UniqueName: \"kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: 
\"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.839866 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.839910 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.844371 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.844625 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.844726 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph\") pod 
\"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.857487 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54c5\" (UniqueName: \"kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5\") pod \"download-cache-openstack-openstack-cell1-rsv2k\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:04 crc kubenswrapper[4822]: I1010 08:27:04.960154 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:27:05 crc kubenswrapper[4822]: I1010 08:27:05.520456 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-rsv2k"] Oct 10 08:27:05 crc kubenswrapper[4822]: I1010 08:27:05.566844 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" event={"ID":"7482993c-2c6c-4ad2-9891-c6eaf07b76e3","Type":"ContainerStarted","Data":"18f85dd7aa5d3fbd1546490652188552cd78414fe0be0f54cb5649cb6cdffb56"} Oct 10 08:27:06 crc kubenswrapper[4822]: I1010 08:27:06.579752 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" event={"ID":"7482993c-2c6c-4ad2-9891-c6eaf07b76e3","Type":"ContainerStarted","Data":"75fc00ce4807e94d4c6409563b5400c7f3764a9cf7fbb03e130aeb8ae52af991"} Oct 10 08:27:06 crc kubenswrapper[4822]: I1010 08:27:06.596177 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" podStartSLOduration=2.08673896 podStartE2EDuration="2.596151072s" podCreationTimestamp="2025-10-10 08:27:04 +0000 UTC" firstStartedPulling="2025-10-10 
08:27:05.527186876 +0000 UTC m=+7372.622345072" lastFinishedPulling="2025-10-10 08:27:06.036598998 +0000 UTC m=+7373.131757184" observedRunningTime="2025-10-10 08:27:06.594448413 +0000 UTC m=+7373.689606629" watchObservedRunningTime="2025-10-10 08:27:06.596151072 +0000 UTC m=+7373.691309258" Oct 10 08:27:06 crc kubenswrapper[4822]: I1010 08:27:06.650961 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:27:06 crc kubenswrapper[4822]: E1010 08:27:06.651343 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:27:19 crc kubenswrapper[4822]: I1010 08:27:19.650674 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:27:19 crc kubenswrapper[4822]: E1010 08:27:19.651738 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:27:32 crc kubenswrapper[4822]: I1010 08:27:32.650634 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:27:33 crc kubenswrapper[4822]: I1010 08:27:33.851314 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7"} Oct 10 08:28:35 crc kubenswrapper[4822]: I1010 08:28:35.566141 4822 generic.go:334] "Generic (PLEG): container finished" podID="7482993c-2c6c-4ad2-9891-c6eaf07b76e3" containerID="75fc00ce4807e94d4c6409563b5400c7f3764a9cf7fbb03e130aeb8ae52af991" exitCode=0 Oct 10 08:28:35 crc kubenswrapper[4822]: I1010 08:28:35.566768 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" event={"ID":"7482993c-2c6c-4ad2-9891-c6eaf07b76e3","Type":"ContainerDied","Data":"75fc00ce4807e94d4c6409563b5400c7f3764a9cf7fbb03e130aeb8ae52af991"} Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.027519 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.143904 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph\") pod \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.143981 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory\") pod \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.144107 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n54c5\" (UniqueName: \"kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5\") pod 
\"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.144293 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key\") pod \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\" (UID: \"7482993c-2c6c-4ad2-9891-c6eaf07b76e3\") " Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.151937 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5" (OuterVolumeSpecName: "kube-api-access-n54c5") pod "7482993c-2c6c-4ad2-9891-c6eaf07b76e3" (UID: "7482993c-2c6c-4ad2-9891-c6eaf07b76e3"). InnerVolumeSpecName "kube-api-access-n54c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.155289 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph" (OuterVolumeSpecName: "ceph") pod "7482993c-2c6c-4ad2-9891-c6eaf07b76e3" (UID: "7482993c-2c6c-4ad2-9891-c6eaf07b76e3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.172982 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7482993c-2c6c-4ad2-9891-c6eaf07b76e3" (UID: "7482993c-2c6c-4ad2-9891-c6eaf07b76e3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.176239 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory" (OuterVolumeSpecName: "inventory") pod "7482993c-2c6c-4ad2-9891-c6eaf07b76e3" (UID: "7482993c-2c6c-4ad2-9891-c6eaf07b76e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.246562 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n54c5\" (UniqueName: \"kubernetes.io/projected/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-kube-api-access-n54c5\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.246636 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.246653 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.246665 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7482993c-2c6c-4ad2-9891-c6eaf07b76e3-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.591258 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" event={"ID":"7482993c-2c6c-4ad2-9891-c6eaf07b76e3","Type":"ContainerDied","Data":"18f85dd7aa5d3fbd1546490652188552cd78414fe0be0f54cb5649cb6cdffb56"} Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.591556 4822 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="18f85dd7aa5d3fbd1546490652188552cd78414fe0be0f54cb5649cb6cdffb56" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.591362 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-rsv2k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.693726 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-7bp7k"] Oct 10 08:28:37 crc kubenswrapper[4822]: E1010 08:28:37.694185 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7482993c-2c6c-4ad2-9891-c6eaf07b76e3" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.694206 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7482993c-2c6c-4ad2-9891-c6eaf07b76e3" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.694526 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7482993c-2c6c-4ad2-9891-c6eaf07b76e3" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.695440 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.700256 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.700711 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.701042 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.701493 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.711912 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-7bp7k"] Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.758470 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.758554 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtmf\" (UniqueName: \"kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.758579 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.758627 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.861060 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.861335 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtmf\" (UniqueName: \"kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.861380 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " 
pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.861535 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.866098 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.866364 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.866919 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory\") pod \"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:37 crc kubenswrapper[4822]: I1010 08:28:37.878094 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtmf\" (UniqueName: \"kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf\") pod 
\"configure-network-openstack-openstack-cell1-7bp7k\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:38 crc kubenswrapper[4822]: I1010 08:28:38.027480 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:28:38 crc kubenswrapper[4822]: I1010 08:28:38.596714 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-7bp7k"] Oct 10 08:28:39 crc kubenswrapper[4822]: I1010 08:28:39.627438 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" event={"ID":"e42ac6ea-b249-45f2-a5fa-fdb828736e26","Type":"ContainerStarted","Data":"2caa1bb06a276028e111a0c1be7a92c064f8edf497d4c637a62d3f01c709c2f7"} Oct 10 08:28:39 crc kubenswrapper[4822]: I1010 08:28:39.628206 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" event={"ID":"e42ac6ea-b249-45f2-a5fa-fdb828736e26","Type":"ContainerStarted","Data":"9ac112ed983a10a17a72f445ed64d830e38bb4009a90c4321cffa607dfc64352"} Oct 10 08:28:39 crc kubenswrapper[4822]: I1010 08:28:39.672607 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" podStartSLOduration=2.210438177 podStartE2EDuration="2.672586449s" podCreationTimestamp="2025-10-10 08:28:37 +0000 UTC" firstStartedPulling="2025-10-10 08:28:38.60418648 +0000 UTC m=+7465.699344676" lastFinishedPulling="2025-10-10 08:28:39.066334752 +0000 UTC m=+7466.161492948" observedRunningTime="2025-10-10 08:28:39.654161909 +0000 UTC m=+7466.749320125" watchObservedRunningTime="2025-10-10 08:28:39.672586449 +0000 UTC m=+7466.767744645" Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.809142 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.811707 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.828100 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.947486 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.947635 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:45 crc kubenswrapper[4822]: I1010 08:28:45.947781 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stc5s\" (UniqueName: \"kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.049814 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " 
pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.049914 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stc5s\" (UniqueName: \"kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.050083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.050618 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.050676 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.071756 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stc5s\" (UniqueName: \"kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s\") pod \"redhat-operators-dpkrf\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " pod="openshift-marketplace/redhat-operators-dpkrf" Oct 
10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.178407 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.670369 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:28:46 crc kubenswrapper[4822]: I1010 08:28:46.711461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerStarted","Data":"82b6e8433f30bdbad7895ab78bcad92899a844711ad425c66003b514650887cb"} Oct 10 08:28:47 crc kubenswrapper[4822]: I1010 08:28:47.741757 4822 generic.go:334] "Generic (PLEG): container finished" podID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerID="4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166" exitCode=0 Oct 10 08:28:47 crc kubenswrapper[4822]: I1010 08:28:47.742022 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerDied","Data":"4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166"} Oct 10 08:28:48 crc kubenswrapper[4822]: I1010 08:28:48.760596 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerStarted","Data":"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d"} Oct 10 08:28:52 crc kubenswrapper[4822]: I1010 08:28:52.814262 4822 generic.go:334] "Generic (PLEG): container finished" podID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerID="d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d" exitCode=0 Oct 10 08:28:52 crc kubenswrapper[4822]: I1010 08:28:52.814336 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" 
event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerDied","Data":"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d"} Oct 10 08:28:53 crc kubenswrapper[4822]: I1010 08:28:53.828193 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerStarted","Data":"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10"} Oct 10 08:28:53 crc kubenswrapper[4822]: I1010 08:28:53.851792 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dpkrf" podStartSLOduration=3.376536634 podStartE2EDuration="8.851771402s" podCreationTimestamp="2025-10-10 08:28:45 +0000 UTC" firstStartedPulling="2025-10-10 08:28:47.744394172 +0000 UTC m=+7474.839552368" lastFinishedPulling="2025-10-10 08:28:53.21962894 +0000 UTC m=+7480.314787136" observedRunningTime="2025-10-10 08:28:53.844253075 +0000 UTC m=+7480.939411301" watchObservedRunningTime="2025-10-10 08:28:53.851771402 +0000 UTC m=+7480.946929618" Oct 10 08:28:56 crc kubenswrapper[4822]: I1010 08:28:56.179545 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:56 crc kubenswrapper[4822]: I1010 08:28:56.180194 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:28:57 crc kubenswrapper[4822]: I1010 08:28:57.229189 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dpkrf" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="registry-server" probeResult="failure" output=< Oct 10 08:28:57 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:28:57 crc kubenswrapper[4822]: > Oct 10 08:29:06 crc kubenswrapper[4822]: I1010 08:29:06.242775 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:29:06 crc kubenswrapper[4822]: I1010 08:29:06.303228 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:29:06 crc kubenswrapper[4822]: I1010 08:29:06.496754 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:29:07 crc kubenswrapper[4822]: I1010 08:29:07.975296 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dpkrf" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="registry-server" containerID="cri-o://193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10" gracePeriod=2 Oct 10 08:29:08 crc kubenswrapper[4822]: E1010 08:29:08.083189 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c96a8ba_12b9_40ab_af6d_b1782f337185.slice/crio-conmon-193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c96a8ba_12b9_40ab_af6d_b1782f337185.slice/crio-193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.520771 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.601981 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content\") pod \"3c96a8ba-12b9-40ab-af6d-b1782f337185\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.602110 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities\") pod \"3c96a8ba-12b9-40ab-af6d-b1782f337185\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.602165 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stc5s\" (UniqueName: \"kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s\") pod \"3c96a8ba-12b9-40ab-af6d-b1782f337185\" (UID: \"3c96a8ba-12b9-40ab-af6d-b1782f337185\") " Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.603330 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities" (OuterVolumeSpecName: "utilities") pod "3c96a8ba-12b9-40ab-af6d-b1782f337185" (UID: "3c96a8ba-12b9-40ab-af6d-b1782f337185"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.608754 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s" (OuterVolumeSpecName: "kube-api-access-stc5s") pod "3c96a8ba-12b9-40ab-af6d-b1782f337185" (UID: "3c96a8ba-12b9-40ab-af6d-b1782f337185"). InnerVolumeSpecName "kube-api-access-stc5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.683784 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c96a8ba-12b9-40ab-af6d-b1782f337185" (UID: "3c96a8ba-12b9-40ab-af6d-b1782f337185"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.704982 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.705020 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c96a8ba-12b9-40ab-af6d-b1782f337185-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.705033 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stc5s\" (UniqueName: \"kubernetes.io/projected/3c96a8ba-12b9-40ab-af6d-b1782f337185-kube-api-access-stc5s\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.991041 4822 generic.go:334] "Generic (PLEG): container finished" podID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerID="193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10" exitCode=0 Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.991100 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerDied","Data":"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10"} Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.991141 4822 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dpkrf" event={"ID":"3c96a8ba-12b9-40ab-af6d-b1782f337185","Type":"ContainerDied","Data":"82b6e8433f30bdbad7895ab78bcad92899a844711ad425c66003b514650887cb"} Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.991163 4822 scope.go:117] "RemoveContainer" containerID="193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10" Oct 10 08:29:08 crc kubenswrapper[4822]: I1010 08:29:08.991382 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpkrf" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.032271 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.033434 4822 scope.go:117] "RemoveContainer" containerID="d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.044530 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dpkrf"] Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.057627 4822 scope.go:117] "RemoveContainer" containerID="4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.114031 4822 scope.go:117] "RemoveContainer" containerID="193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10" Oct 10 08:29:09 crc kubenswrapper[4822]: E1010 08:29:09.114588 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10\": container with ID starting with 193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10 not found: ID does not exist" containerID="193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.114626 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10"} err="failed to get container status \"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10\": rpc error: code = NotFound desc = could not find container \"193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10\": container with ID starting with 193487a0afd8c7e1c86ca4cd7beee8613d581fe82701125452799daa34374c10 not found: ID does not exist" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.114655 4822 scope.go:117] "RemoveContainer" containerID="d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d" Oct 10 08:29:09 crc kubenswrapper[4822]: E1010 08:29:09.115120 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d\": container with ID starting with d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d not found: ID does not exist" containerID="d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.115241 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d"} err="failed to get container status \"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d\": rpc error: code = NotFound desc = could not find container \"d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d\": container with ID starting with d7c99ba56a35434e8095aabe3abff50b351b6c509b0d22dbb08b87d0c03be22d not found: ID does not exist" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.115333 4822 scope.go:117] "RemoveContainer" containerID="4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166" Oct 10 08:29:09 crc kubenswrapper[4822]: E1010 
08:29:09.115934 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166\": container with ID starting with 4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166 not found: ID does not exist" containerID="4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.115995 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166"} err="failed to get container status \"4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166\": rpc error: code = NotFound desc = could not find container \"4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166\": container with ID starting with 4b41311dd9cafe45ababf6099165bdc427b457a55e9508fdec1b2145abea9166 not found: ID does not exist" Oct 10 08:29:09 crc kubenswrapper[4822]: I1010 08:29:09.662699 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" path="/var/lib/kubelet/pods/3c96a8ba-12b9-40ab-af6d-b1782f337185/volumes" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.904369 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:11 crc kubenswrapper[4822]: E1010 08:29:11.905362 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="registry-server" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.905382 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="registry-server" Oct 10 08:29:11 crc kubenswrapper[4822]: E1010 08:29:11.905417 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="extract-content" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.905426 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="extract-content" Oct 10 08:29:11 crc kubenswrapper[4822]: E1010 08:29:11.905451 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="extract-utilities" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.905462 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="extract-utilities" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.905736 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c96a8ba-12b9-40ab-af6d-b1782f337185" containerName="registry-server" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.907866 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.913973 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.981645 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.982097 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkl7\" (UniqueName: \"kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7\") pod \"certified-operators-65dsl\" (UID: 
\"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:11 crc kubenswrapper[4822]: I1010 08:29:11.982716 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.084718 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.084771 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.084888 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkl7\" (UniqueName: \"kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.085659 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities\") pod \"certified-operators-65dsl\" (UID: 
\"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.085866 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.108203 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkl7\" (UniqueName: \"kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7\") pod \"certified-operators-65dsl\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.235244 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:12 crc kubenswrapper[4822]: I1010 08:29:12.589095 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:13 crc kubenswrapper[4822]: I1010 08:29:13.038321 4822 generic.go:334] "Generic (PLEG): container finished" podID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerID="a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071" exitCode=0 Oct 10 08:29:13 crc kubenswrapper[4822]: I1010 08:29:13.038385 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerDied","Data":"a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071"} Oct 10 08:29:13 crc kubenswrapper[4822]: I1010 08:29:13.038422 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerStarted","Data":"a8665c6b2efc2678caa4455ea8bb25db780ed54081da8fafc0f9dbb7d8001f3b"} Oct 10 08:29:14 crc kubenswrapper[4822]: I1010 08:29:14.051049 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerStarted","Data":"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc"} Oct 10 08:29:15 crc kubenswrapper[4822]: I1010 08:29:15.064220 4822 generic.go:334] "Generic (PLEG): container finished" podID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerID="7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc" exitCode=0 Oct 10 08:29:15 crc kubenswrapper[4822]: I1010 08:29:15.064324 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" 
event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerDied","Data":"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc"} Oct 10 08:29:16 crc kubenswrapper[4822]: I1010 08:29:16.076225 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerStarted","Data":"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6"} Oct 10 08:29:16 crc kubenswrapper[4822]: I1010 08:29:16.097375 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-65dsl" podStartSLOduration=2.623555595 podStartE2EDuration="5.097360117s" podCreationTimestamp="2025-10-10 08:29:11 +0000 UTC" firstStartedPulling="2025-10-10 08:29:13.042568015 +0000 UTC m=+7500.137726261" lastFinishedPulling="2025-10-10 08:29:15.516372587 +0000 UTC m=+7502.611530783" observedRunningTime="2025-10-10 08:29:16.094492654 +0000 UTC m=+7503.189650850" watchObservedRunningTime="2025-10-10 08:29:16.097360117 +0000 UTC m=+7503.192518313" Oct 10 08:29:22 crc kubenswrapper[4822]: I1010 08:29:22.236438 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:22 crc kubenswrapper[4822]: I1010 08:29:22.236973 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:22 crc kubenswrapper[4822]: I1010 08:29:22.294638 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:23 crc kubenswrapper[4822]: I1010 08:29:23.202198 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:23 crc kubenswrapper[4822]: I1010 08:29:23.262352 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.175541 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-65dsl" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="registry-server" containerID="cri-o://c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6" gracePeriod=2 Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.651419 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.786768 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content\") pod \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.786939 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities\") pod \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.786977 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pkl7\" (UniqueName: \"kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7\") pod \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\" (UID: \"55ef3ae0-b4de-4980-977f-d75097cf1d9b\") " Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.789158 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities" (OuterVolumeSpecName: "utilities") pod "55ef3ae0-b4de-4980-977f-d75097cf1d9b" (UID: 
"55ef3ae0-b4de-4980-977f-d75097cf1d9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.792747 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7" (OuterVolumeSpecName: "kube-api-access-5pkl7") pod "55ef3ae0-b4de-4980-977f-d75097cf1d9b" (UID: "55ef3ae0-b4de-4980-977f-d75097cf1d9b"). InnerVolumeSpecName "kube-api-access-5pkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.853150 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55ef3ae0-b4de-4980-977f-d75097cf1d9b" (UID: "55ef3ae0-b4de-4980-977f-d75097cf1d9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.889498 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.889535 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef3ae0-b4de-4980-977f-d75097cf1d9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:25 crc kubenswrapper[4822]: I1010 08:29:25.889546 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pkl7\" (UniqueName: \"kubernetes.io/projected/55ef3ae0-b4de-4980-977f-d75097cf1d9b-kube-api-access-5pkl7\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.189964 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerID="c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6" exitCode=0 Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.190025 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerDied","Data":"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6"} Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.190062 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65dsl" event={"ID":"55ef3ae0-b4de-4980-977f-d75097cf1d9b","Type":"ContainerDied","Data":"a8665c6b2efc2678caa4455ea8bb25db780ed54081da8fafc0f9dbb7d8001f3b"} Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.190083 4822 scope.go:117] "RemoveContainer" containerID="c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.190963 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65dsl" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.214598 4822 scope.go:117] "RemoveContainer" containerID="7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.249545 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.261367 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-65dsl"] Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.273973 4822 scope.go:117] "RemoveContainer" containerID="a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.302434 4822 scope.go:117] "RemoveContainer" containerID="c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6" Oct 10 08:29:26 crc kubenswrapper[4822]: E1010 08:29:26.305863 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6\": container with ID starting with c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6 not found: ID does not exist" containerID="c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.305909 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6"} err="failed to get container status \"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6\": rpc error: code = NotFound desc = could not find container \"c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6\": container with ID starting with c60093599dfb8b2fb4faa059acf5834b481d9f46503c4086e9eea5617f725ee6 not 
found: ID does not exist" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.305936 4822 scope.go:117] "RemoveContainer" containerID="7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc" Oct 10 08:29:26 crc kubenswrapper[4822]: E1010 08:29:26.306250 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc\": container with ID starting with 7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc not found: ID does not exist" containerID="7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.306320 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc"} err="failed to get container status \"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc\": rpc error: code = NotFound desc = could not find container \"7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc\": container with ID starting with 7a6c5cb19bae88e93ff960f141862b112d97c00fbe3b3b9c3fe2fe9bf8cf95cc not found: ID does not exist" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.306355 4822 scope.go:117] "RemoveContainer" containerID="a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071" Oct 10 08:29:26 crc kubenswrapper[4822]: E1010 08:29:26.306731 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071\": container with ID starting with a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071 not found: ID does not exist" containerID="a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071" Oct 10 08:29:26 crc kubenswrapper[4822]: I1010 08:29:26.306769 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071"} err="failed to get container status \"a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071\": rpc error: code = NotFound desc = could not find container \"a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071\": container with ID starting with a4f5930835e3bff01cede26c888f5ef8d49d3c99f00f616ccdde58903a8f1071 not found: ID does not exist" Oct 10 08:29:27 crc kubenswrapper[4822]: I1010 08:29:27.683764 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" path="/var/lib/kubelet/pods/55ef3ae0-b4de-4980-977f-d75097cf1d9b/volumes" Oct 10 08:29:56 crc kubenswrapper[4822]: I1010 08:29:56.570654 4822 generic.go:334] "Generic (PLEG): container finished" podID="e42ac6ea-b249-45f2-a5fa-fdb828736e26" containerID="2caa1bb06a276028e111a0c1be7a92c064f8edf497d4c637a62d3f01c709c2f7" exitCode=0 Oct 10 08:29:56 crc kubenswrapper[4822]: I1010 08:29:56.570757 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" event={"ID":"e42ac6ea-b249-45f2-a5fa-fdb828736e26","Type":"ContainerDied","Data":"2caa1bb06a276028e111a0c1be7a92c064f8edf497d4c637a62d3f01c709c2f7"} Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.045508 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.227903 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key\") pod \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.227984 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory\") pod \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.228309 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph\") pod \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.228791 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbtmf\" (UniqueName: \"kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf\") pod \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\" (UID: \"e42ac6ea-b249-45f2-a5fa-fdb828736e26\") " Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.235201 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph" (OuterVolumeSpecName: "ceph") pod "e42ac6ea-b249-45f2-a5fa-fdb828736e26" (UID: "e42ac6ea-b249-45f2-a5fa-fdb828736e26"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.236787 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf" (OuterVolumeSpecName: "kube-api-access-bbtmf") pod "e42ac6ea-b249-45f2-a5fa-fdb828736e26" (UID: "e42ac6ea-b249-45f2-a5fa-fdb828736e26"). InnerVolumeSpecName "kube-api-access-bbtmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.280883 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory" (OuterVolumeSpecName: "inventory") pod "e42ac6ea-b249-45f2-a5fa-fdb828736e26" (UID: "e42ac6ea-b249-45f2-a5fa-fdb828736e26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.281105 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e42ac6ea-b249-45f2-a5fa-fdb828736e26" (UID: "e42ac6ea-b249-45f2-a5fa-fdb828736e26"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.333924 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.334162 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.334181 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e42ac6ea-b249-45f2-a5fa-fdb828736e26-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.334201 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbtmf\" (UniqueName: \"kubernetes.io/projected/e42ac6ea-b249-45f2-a5fa-fdb828736e26-kube-api-access-bbtmf\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.590920 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" event={"ID":"e42ac6ea-b249-45f2-a5fa-fdb828736e26","Type":"ContainerDied","Data":"9ac112ed983a10a17a72f445ed64d830e38bb4009a90c4321cffa607dfc64352"} Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.591024 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac112ed983a10a17a72f445ed64d830e38bb4009a90c4321cffa607dfc64352" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.590977 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-7bp7k" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.693415 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fl6w8"] Oct 10 08:29:58 crc kubenswrapper[4822]: E1010 08:29:58.693999 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="extract-content" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694023 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="extract-content" Oct 10 08:29:58 crc kubenswrapper[4822]: E1010 08:29:58.694039 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="registry-server" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694048 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="registry-server" Oct 10 08:29:58 crc kubenswrapper[4822]: E1010 08:29:58.694069 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="extract-utilities" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694077 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="extract-utilities" Oct 10 08:29:58 crc kubenswrapper[4822]: E1010 08:29:58.694092 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42ac6ea-b249-45f2-a5fa-fdb828736e26" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694102 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42ac6ea-b249-45f2-a5fa-fdb828736e26" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694352 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="55ef3ae0-b4de-4980-977f-d75097cf1d9b" containerName="registry-server" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.694397 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42ac6ea-b249-45f2-a5fa-fdb828736e26" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.695368 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.697878 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.698402 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.698644 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.698835 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.704555 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fl6w8"] Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.844610 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.844900 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd46q\" (UniqueName: \"kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.844976 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.845087 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.947922 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.948257 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd46q\" (UniqueName: \"kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: 
\"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.948407 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.948494 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.952903 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.953070 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.953408 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory\") pod 
\"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:58 crc kubenswrapper[4822]: I1010 08:29:58.981178 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd46q\" (UniqueName: \"kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q\") pod \"validate-network-openstack-openstack-cell1-fl6w8\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:59 crc kubenswrapper[4822]: I1010 08:29:59.021515 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:29:59 crc kubenswrapper[4822]: I1010 08:29:59.577445 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-fl6w8"] Oct 10 08:29:59 crc kubenswrapper[4822]: I1010 08:29:59.590711 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:29:59 crc kubenswrapper[4822]: I1010 08:29:59.613318 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" event={"ID":"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b","Type":"ContainerStarted","Data":"19a7ebdae0833c8af28721d5d869f131f81590664e1b349cecae91f603a75bcd"} Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.179969 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4"] Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.184164 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.186632 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.187015 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.219326 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4"] Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.280386 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.280783 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9p5\" (UniqueName: \"kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.281349 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.383827 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.384504 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.384597 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9p5\" (UniqueName: \"kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.385724 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.388248 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.401920 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9p5\" (UniqueName: \"kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5\") pod \"collect-profiles-29334750-72tc4\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.515672 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.636428 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" event={"ID":"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b","Type":"ContainerStarted","Data":"258037862b92a25c95b7385341392e916fbac6bfb63e85faa5b224181892ee2c"} Oct 10 08:30:00 crc kubenswrapper[4822]: I1010 08:30:00.662793 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" podStartSLOduration=1.957311603 podStartE2EDuration="2.662770364s" podCreationTimestamp="2025-10-10 08:29:58 +0000 UTC" firstStartedPulling="2025-10-10 08:29:59.590496293 +0000 UTC m=+7546.685654489" lastFinishedPulling="2025-10-10 08:30:00.295955054 +0000 UTC m=+7547.391113250" observedRunningTime="2025-10-10 08:30:00.657636786 +0000 UTC m=+7547.752794992" watchObservedRunningTime="2025-10-10 08:30:00.662770364 +0000 UTC m=+7547.757928550" Oct 10 08:30:01 crc kubenswrapper[4822]: I1010 08:30:01.003383 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4"] Oct 10 08:30:01 crc kubenswrapper[4822]: W1010 08:30:01.015637 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa2ec76_1a60_4849_98cc_c30e58af2078.slice/crio-dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a WatchSource:0}: Error finding container dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a: Status 404 returned error can't find the container with id dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a Oct 10 08:30:01 crc kubenswrapper[4822]: I1010 08:30:01.337298 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:30:01 crc kubenswrapper[4822]: I1010 08:30:01.337977 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:30:01 crc kubenswrapper[4822]: I1010 08:30:01.647796 4822 generic.go:334] "Generic (PLEG): container finished" podID="2aa2ec76-1a60-4849-98cc-c30e58af2078" containerID="aed808d0613828ad03d66294de6ebd44b089dcdd6c0b1da06589d2c44116e1e1" exitCode=0 Oct 10 08:30:01 crc kubenswrapper[4822]: I1010 08:30:01.647911 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" event={"ID":"2aa2ec76-1a60-4849-98cc-c30e58af2078","Type":"ContainerDied","Data":"aed808d0613828ad03d66294de6ebd44b089dcdd6c0b1da06589d2c44116e1e1"} Oct 10 08:30:01 crc 
kubenswrapper[4822]: I1010 08:30:01.647959 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" event={"ID":"2aa2ec76-1a60-4849-98cc-c30e58af2078","Type":"ContainerStarted","Data":"dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a"} Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.068515 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.157276 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume\") pod \"2aa2ec76-1a60-4849-98cc-c30e58af2078\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.157537 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume\") pod \"2aa2ec76-1a60-4849-98cc-c30e58af2078\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.157893 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv9p5\" (UniqueName: \"kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5\") pod \"2aa2ec76-1a60-4849-98cc-c30e58af2078\" (UID: \"2aa2ec76-1a60-4849-98cc-c30e58af2078\") " Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.158123 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume" (OuterVolumeSpecName: "config-volume") pod "2aa2ec76-1a60-4849-98cc-c30e58af2078" (UID: "2aa2ec76-1a60-4849-98cc-c30e58af2078"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.158620 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa2ec76-1a60-4849-98cc-c30e58af2078-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.168116 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2aa2ec76-1a60-4849-98cc-c30e58af2078" (UID: "2aa2ec76-1a60-4849-98cc-c30e58af2078"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.168371 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5" (OuterVolumeSpecName: "kube-api-access-lv9p5") pod "2aa2ec76-1a60-4849-98cc-c30e58af2078" (UID: "2aa2ec76-1a60-4849-98cc-c30e58af2078"). InnerVolumeSpecName "kube-api-access-lv9p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.260889 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa2ec76-1a60-4849-98cc-c30e58af2078-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.260931 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv9p5\" (UniqueName: \"kubernetes.io/projected/2aa2ec76-1a60-4849-98cc-c30e58af2078-kube-api-access-lv9p5\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.671948 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" event={"ID":"2aa2ec76-1a60-4849-98cc-c30e58af2078","Type":"ContainerDied","Data":"dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a"} Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.671993 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0d23694a0f16ee4ecb119e72a5990361aafd300ed80e1d81ee6df11334466a" Oct 10 08:30:03 crc kubenswrapper[4822]: I1010 08:30:03.672450 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4" Oct 10 08:30:04 crc kubenswrapper[4822]: I1010 08:30:04.150702 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f"] Oct 10 08:30:04 crc kubenswrapper[4822]: I1010 08:30:04.159407 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-fj56f"] Oct 10 08:30:05 crc kubenswrapper[4822]: I1010 08:30:05.670123 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e67f756-5528-4a8d-99f3-bec56fefc38f" path="/var/lib/kubelet/pods/0e67f756-5528-4a8d-99f3-bec56fefc38f/volumes" Oct 10 08:30:05 crc kubenswrapper[4822]: I1010 08:30:05.710340 4822 generic.go:334] "Generic (PLEG): container finished" podID="f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" containerID="258037862b92a25c95b7385341392e916fbac6bfb63e85faa5b224181892ee2c" exitCode=0 Oct 10 08:30:05 crc kubenswrapper[4822]: I1010 08:30:05.710398 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" event={"ID":"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b","Type":"ContainerDied","Data":"258037862b92a25c95b7385341392e916fbac6bfb63e85faa5b224181892ee2c"} Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.245275 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.362595 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd46q\" (UniqueName: \"kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q\") pod \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.362961 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory\") pod \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.363035 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph\") pod \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.363084 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key\") pod \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\" (UID: \"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b\") " Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.369252 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q" (OuterVolumeSpecName: "kube-api-access-xd46q") pod "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" (UID: "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b"). InnerVolumeSpecName "kube-api-access-xd46q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.382656 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph" (OuterVolumeSpecName: "ceph") pod "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" (UID: "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.395567 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory" (OuterVolumeSpecName: "inventory") pod "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" (UID: "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.398437 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" (UID: "f4b5919e-eeeb-4649-9ac9-6c18676b2a5b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.466364 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd46q\" (UniqueName: \"kubernetes.io/projected/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-kube-api-access-xd46q\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.466459 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.466492 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.466517 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4b5919e-eeeb-4649-9ac9-6c18676b2a5b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.730341 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" event={"ID":"f4b5919e-eeeb-4649-9ac9-6c18676b2a5b","Type":"ContainerDied","Data":"19a7ebdae0833c8af28721d5d869f131f81590664e1b349cecae91f603a75bcd"} Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.730371 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-fl6w8" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.730422 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19a7ebdae0833c8af28721d5d869f131f81590664e1b349cecae91f603a75bcd" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.807820 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-jbpgq"] Oct 10 08:30:07 crc kubenswrapper[4822]: E1010 08:30:07.812259 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.812295 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:30:07 crc kubenswrapper[4822]: E1010 08:30:07.812333 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa2ec76-1a60-4849-98cc-c30e58af2078" containerName="collect-profiles" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.812344 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa2ec76-1a60-4849-98cc-c30e58af2078" containerName="collect-profiles" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.812715 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa2ec76-1a60-4849-98cc-c30e58af2078" containerName="collect-profiles" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.812755 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b5919e-eeeb-4649-9ac9-6c18676b2a5b" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.813827 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.815725 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.815892 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.815978 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.819984 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.822019 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-jbpgq"] Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.980001 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.980158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.980210 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58w4r\" (UniqueName: 
\"kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:07 crc kubenswrapper[4822]: I1010 08:30:07.980388 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.082521 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.082891 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.083177 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.083320 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-58w4r\" (UniqueName: \"kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.087828 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.087881 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.091215 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.101174 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58w4r\" (UniqueName: \"kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r\") pod \"install-os-openstack-openstack-cell1-jbpgq\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.136716 4822 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.696751 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-jbpgq"] Oct 10 08:30:08 crc kubenswrapper[4822]: I1010 08:30:08.741512 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" event={"ID":"3ff8c46f-3adf-4c25-8ab2-de6920f49542","Type":"ContainerStarted","Data":"6cceca8034664e30858ace0d680effe7aa2f8bdab18e13bcce26de58fdb754b1"} Oct 10 08:30:09 crc kubenswrapper[4822]: I1010 08:30:09.772909 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" event={"ID":"3ff8c46f-3adf-4c25-8ab2-de6920f49542","Type":"ContainerStarted","Data":"f54f4ee62784b1f4ab2dbd0bef80816117a2a4708a2f2605555df1fad561469e"} Oct 10 08:30:09 crc kubenswrapper[4822]: I1010 08:30:09.800594 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" podStartSLOduration=2.418554307 podStartE2EDuration="2.800573845s" podCreationTimestamp="2025-10-10 08:30:07 +0000 UTC" firstStartedPulling="2025-10-10 08:30:08.700098383 +0000 UTC m=+7555.795256569" lastFinishedPulling="2025-10-10 08:30:09.082117911 +0000 UTC m=+7556.177276107" observedRunningTime="2025-10-10 08:30:09.791313829 +0000 UTC m=+7556.886472045" watchObservedRunningTime="2025-10-10 08:30:09.800573845 +0000 UTC m=+7556.895732051" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.293349 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.297312 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.313316 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.451085 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6g8v\" (UniqueName: \"kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.451177 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.451279 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.553519 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6g8v\" (UniqueName: \"kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.553655 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.553704 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.554337 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.554471 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.582376 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6g8v\" (UniqueName: \"kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v\") pod \"community-operators-pgn6p\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:23 crc kubenswrapper[4822]: I1010 08:30:23.622086 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:24 crc kubenswrapper[4822]: I1010 08:30:24.191906 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:24 crc kubenswrapper[4822]: I1010 08:30:24.942249 4822 generic.go:334] "Generic (PLEG): container finished" podID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerID="2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01" exitCode=0 Oct 10 08:30:24 crc kubenswrapper[4822]: I1010 08:30:24.942321 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerDied","Data":"2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01"} Oct 10 08:30:24 crc kubenswrapper[4822]: I1010 08:30:24.942581 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerStarted","Data":"3a51996dc3333f0491d766865293bfb86feee0626ef0d2d8b8b794e204a86283"} Oct 10 08:30:26 crc kubenswrapper[4822]: I1010 08:30:26.977031 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerStarted","Data":"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132"} Oct 10 08:30:27 crc kubenswrapper[4822]: I1010 08:30:27.990246 4822 generic.go:334] "Generic (PLEG): container finished" podID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerID="53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132" exitCode=0 Oct 10 08:30:27 crc kubenswrapper[4822]: I1010 08:30:27.990302 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" 
event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerDied","Data":"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132"} Oct 10 08:30:29 crc kubenswrapper[4822]: I1010 08:30:29.007650 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerStarted","Data":"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a"} Oct 10 08:30:29 crc kubenswrapper[4822]: I1010 08:30:29.034709 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pgn6p" podStartSLOduration=2.320523298 podStartE2EDuration="6.034681945s" podCreationTimestamp="2025-10-10 08:30:23 +0000 UTC" firstStartedPulling="2025-10-10 08:30:24.947994503 +0000 UTC m=+7572.043152709" lastFinishedPulling="2025-10-10 08:30:28.66215315 +0000 UTC m=+7575.757311356" observedRunningTime="2025-10-10 08:30:29.026266363 +0000 UTC m=+7576.121424569" watchObservedRunningTime="2025-10-10 08:30:29.034681945 +0000 UTC m=+7576.129840161" Oct 10 08:30:31 crc kubenswrapper[4822]: I1010 08:30:31.337089 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:30:31 crc kubenswrapper[4822]: I1010 08:30:31.337386 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:30:33 crc kubenswrapper[4822]: I1010 08:30:33.623117 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:33 crc kubenswrapper[4822]: I1010 08:30:33.623980 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:33 crc kubenswrapper[4822]: I1010 08:30:33.689517 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:34 crc kubenswrapper[4822]: I1010 08:30:34.140927 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:34 crc kubenswrapper[4822]: I1010 08:30:34.201177 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.085662 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pgn6p" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="registry-server" containerID="cri-o://5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a" gracePeriod=2 Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.631265 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.812051 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities\") pod \"42b05725-5db6-4e68-81e9-c44ffd0aa486\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.812117 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content\") pod \"42b05725-5db6-4e68-81e9-c44ffd0aa486\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.812463 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6g8v\" (UniqueName: \"kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v\") pod \"42b05725-5db6-4e68-81e9-c44ffd0aa486\" (UID: \"42b05725-5db6-4e68-81e9-c44ffd0aa486\") " Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.813291 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities" (OuterVolumeSpecName: "utilities") pod "42b05725-5db6-4e68-81e9-c44ffd0aa486" (UID: "42b05725-5db6-4e68-81e9-c44ffd0aa486"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.813496 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.819855 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v" (OuterVolumeSpecName: "kube-api-access-l6g8v") pod "42b05725-5db6-4e68-81e9-c44ffd0aa486" (UID: "42b05725-5db6-4e68-81e9-c44ffd0aa486"). InnerVolumeSpecName "kube-api-access-l6g8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:36 crc kubenswrapper[4822]: I1010 08:30:36.914931 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6g8v\" (UniqueName: \"kubernetes.io/projected/42b05725-5db6-4e68-81e9-c44ffd0aa486-kube-api-access-l6g8v\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.097738 4822 generic.go:334] "Generic (PLEG): container finished" podID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerID="5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a" exitCode=0 Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.097783 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgn6p" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.097820 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerDied","Data":"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a"} Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.097904 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgn6p" event={"ID":"42b05725-5db6-4e68-81e9-c44ffd0aa486","Type":"ContainerDied","Data":"3a51996dc3333f0491d766865293bfb86feee0626ef0d2d8b8b794e204a86283"} Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.097938 4822 scope.go:117] "RemoveContainer" containerID="5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.124139 4822 scope.go:117] "RemoveContainer" containerID="53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.161431 4822 scope.go:117] "RemoveContainer" containerID="2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.208898 4822 scope.go:117] "RemoveContainer" containerID="5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a" Oct 10 08:30:37 crc kubenswrapper[4822]: E1010 08:30:37.224425 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a\": container with ID starting with 5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a not found: ID does not exist" containerID="5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.224880 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a"} err="failed to get container status \"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a\": rpc error: code = NotFound desc = could not find container \"5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a\": container with ID starting with 5193bd712fdcc2713bc90c0316c87c7bf3641603b41deea1099e57f4f21b4e1a not found: ID does not exist" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.224939 4822 scope.go:117] "RemoveContainer" containerID="53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132" Oct 10 08:30:37 crc kubenswrapper[4822]: E1010 08:30:37.226552 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132\": container with ID starting with 53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132 not found: ID does not exist" containerID="53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.226589 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132"} err="failed to get container status \"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132\": rpc error: code = NotFound desc = could not find container \"53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132\": container with ID starting with 53ce450b4af71180e92e4223e20fbc6d7a53476e3936531e4f4268809d7fb132 not found: ID does not exist" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.226609 4822 scope.go:117] "RemoveContainer" containerID="2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01" Oct 10 08:30:37 crc kubenswrapper[4822]: E1010 08:30:37.227976 4822 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01\": container with ID starting with 2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01 not found: ID does not exist" containerID="2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.228008 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01"} err="failed to get container status \"2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01\": rpc error: code = NotFound desc = could not find container \"2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01\": container with ID starting with 2e2b5c9a223c2191720fa76b0f9a9357e0ba1bd9de46ad71171178c1388b0d01 not found: ID does not exist" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.231101 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42b05725-5db6-4e68-81e9-c44ffd0aa486" (UID: "42b05725-5db6-4e68-81e9-c44ffd0aa486"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.326260 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b05725-5db6-4e68-81e9-c44ffd0aa486-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.451319 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.463106 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pgn6p"] Oct 10 08:30:37 crc kubenswrapper[4822]: I1010 08:30:37.670353 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" path="/var/lib/kubelet/pods/42b05725-5db6-4e68-81e9-c44ffd0aa486/volumes" Oct 10 08:30:38 crc kubenswrapper[4822]: I1010 08:30:38.567983 4822 scope.go:117] "RemoveContainer" containerID="303e38474fea639c35df7651edc3d1ecd052c0b996fe65235b15aa4722f1ec97" Oct 10 08:30:54 crc kubenswrapper[4822]: I1010 08:30:54.285909 4822 generic.go:334] "Generic (PLEG): container finished" podID="3ff8c46f-3adf-4c25-8ab2-de6920f49542" containerID="f54f4ee62784b1f4ab2dbd0bef80816117a2a4708a2f2605555df1fad561469e" exitCode=0 Oct 10 08:30:54 crc kubenswrapper[4822]: I1010 08:30:54.285970 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" event={"ID":"3ff8c46f-3adf-4c25-8ab2-de6920f49542","Type":"ContainerDied","Data":"f54f4ee62784b1f4ab2dbd0bef80816117a2a4708a2f2605555df1fad561469e"} Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.782797 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.957605 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key\") pod \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.957689 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory\") pod \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.957950 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph\") pod \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.958059 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58w4r\" (UniqueName: \"kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r\") pod \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\" (UID: \"3ff8c46f-3adf-4c25-8ab2-de6920f49542\") " Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.964534 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph" (OuterVolumeSpecName: "ceph") pod "3ff8c46f-3adf-4c25-8ab2-de6920f49542" (UID: "3ff8c46f-3adf-4c25-8ab2-de6920f49542"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.965239 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r" (OuterVolumeSpecName: "kube-api-access-58w4r") pod "3ff8c46f-3adf-4c25-8ab2-de6920f49542" (UID: "3ff8c46f-3adf-4c25-8ab2-de6920f49542"). InnerVolumeSpecName "kube-api-access-58w4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:55 crc kubenswrapper[4822]: I1010 08:30:55.994989 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory" (OuterVolumeSpecName: "inventory") pod "3ff8c46f-3adf-4c25-8ab2-de6920f49542" (UID: "3ff8c46f-3adf-4c25-8ab2-de6920f49542"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.006461 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ff8c46f-3adf-4c25-8ab2-de6920f49542" (UID: "3ff8c46f-3adf-4c25-8ab2-de6920f49542"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.061371 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.061423 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58w4r\" (UniqueName: \"kubernetes.io/projected/3ff8c46f-3adf-4c25-8ab2-de6920f49542-kube-api-access-58w4r\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.061439 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.061447 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ff8c46f-3adf-4c25-8ab2-de6920f49542-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.309828 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" event={"ID":"3ff8c46f-3adf-4c25-8ab2-de6920f49542","Type":"ContainerDied","Data":"6cceca8034664e30858ace0d680effe7aa2f8bdab18e13bcce26de58fdb754b1"} Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.309869 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cceca8034664e30858ace0d680effe7aa2f8bdab18e13bcce26de58fdb754b1" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.309884 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-jbpgq" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.419976 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k6sr2"] Oct 10 08:30:56 crc kubenswrapper[4822]: E1010 08:30:56.420881 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="extract-utilities" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.420902 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="extract-utilities" Oct 10 08:30:56 crc kubenswrapper[4822]: E1010 08:30:56.420936 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="registry-server" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.420945 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="registry-server" Oct 10 08:30:56 crc kubenswrapper[4822]: E1010 08:30:56.420967 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff8c46f-3adf-4c25-8ab2-de6920f49542" containerName="install-os-openstack-openstack-cell1" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.420976 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff8c46f-3adf-4c25-8ab2-de6920f49542" containerName="install-os-openstack-openstack-cell1" Oct 10 08:30:56 crc kubenswrapper[4822]: E1010 08:30:56.420992 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="extract-content" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.421000 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="extract-content" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.421263 4822 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3ff8c46f-3adf-4c25-8ab2-de6920f49542" containerName="install-os-openstack-openstack-cell1" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.421290 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b05725-5db6-4e68-81e9-c44ffd0aa486" containerName="registry-server" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.422055 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.426087 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.426550 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.427447 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.430219 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.437880 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k6sr2"] Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.575922 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.576295 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.576414 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.576693 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpf92\" (UniqueName: \"kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.678979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpf92\" (UniqueName: \"kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.679114 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc 
kubenswrapper[4822]: I1010 08:30:56.679209 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.679248 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.682942 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.694750 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.695600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc 
kubenswrapper[4822]: I1010 08:30:56.698584 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpf92\" (UniqueName: \"kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92\") pod \"configure-os-openstack-openstack-cell1-k6sr2\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:56 crc kubenswrapper[4822]: I1010 08:30:56.743508 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:30:57 crc kubenswrapper[4822]: I1010 08:30:57.278486 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k6sr2"] Oct 10 08:30:57 crc kubenswrapper[4822]: I1010 08:30:57.324676 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" event={"ID":"ff18ff5e-22b5-464b-a2f0-f879bc31db11","Type":"ContainerStarted","Data":"37f86e93983a7eb4aef25571812edeb56f603b52980d2959d04315e1d2b7849b"} Oct 10 08:30:59 crc kubenswrapper[4822]: I1010 08:30:59.346461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" event={"ID":"ff18ff5e-22b5-464b-a2f0-f879bc31db11","Type":"ContainerStarted","Data":"c538668377bbb40228b523adc8d4e0681b51e740b33657ad9b5ee6fa5b0823da"} Oct 10 08:30:59 crc kubenswrapper[4822]: I1010 08:30:59.379225 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" podStartSLOduration=2.5615292739999997 podStartE2EDuration="3.379206682s" podCreationTimestamp="2025-10-10 08:30:56 +0000 UTC" firstStartedPulling="2025-10-10 08:30:57.279046477 +0000 UTC m=+7604.374204673" lastFinishedPulling="2025-10-10 08:30:58.096723885 +0000 UTC m=+7605.191882081" observedRunningTime="2025-10-10 08:30:59.372510079 +0000 UTC 
m=+7606.467668295" watchObservedRunningTime="2025-10-10 08:30:59.379206682 +0000 UTC m=+7606.474364878" Oct 10 08:31:01 crc kubenswrapper[4822]: I1010 08:31:01.336774 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:31:01 crc kubenswrapper[4822]: I1010 08:31:01.337241 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:31:01 crc kubenswrapper[4822]: I1010 08:31:01.337301 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:31:01 crc kubenswrapper[4822]: I1010 08:31:01.338377 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:31:01 crc kubenswrapper[4822]: I1010 08:31:01.338449 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7" gracePeriod=600 Oct 10 08:31:02 crc kubenswrapper[4822]: I1010 08:31:02.377265 4822 generic.go:334] "Generic (PLEG): container 
finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7" exitCode=0 Oct 10 08:31:02 crc kubenswrapper[4822]: I1010 08:31:02.377342 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7"} Oct 10 08:31:02 crc kubenswrapper[4822]: I1010 08:31:02.378063 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1"} Oct 10 08:31:02 crc kubenswrapper[4822]: I1010 08:31:02.378110 4822 scope.go:117] "RemoveContainer" containerID="cec1a306f00bcbdd1211339188579ce66815030872a26d90e672d91eccbb7293" Oct 10 08:31:43 crc kubenswrapper[4822]: I1010 08:31:43.820628 4822 generic.go:334] "Generic (PLEG): container finished" podID="ff18ff5e-22b5-464b-a2f0-f879bc31db11" containerID="c538668377bbb40228b523adc8d4e0681b51e740b33657ad9b5ee6fa5b0823da" exitCode=0 Oct 10 08:31:43 crc kubenswrapper[4822]: I1010 08:31:43.820719 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" event={"ID":"ff18ff5e-22b5-464b-a2f0-f879bc31db11","Type":"ContainerDied","Data":"c538668377bbb40228b523adc8d4e0681b51e740b33657ad9b5ee6fa5b0823da"} Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.278720 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.403605 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory\") pod \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.403720 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key\") pod \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.403871 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpf92\" (UniqueName: \"kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92\") pod \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.403949 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph\") pod \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\" (UID: \"ff18ff5e-22b5-464b-a2f0-f879bc31db11\") " Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.410104 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph" (OuterVolumeSpecName: "ceph") pod "ff18ff5e-22b5-464b-a2f0-f879bc31db11" (UID: "ff18ff5e-22b5-464b-a2f0-f879bc31db11"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.415536 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92" (OuterVolumeSpecName: "kube-api-access-vpf92") pod "ff18ff5e-22b5-464b-a2f0-f879bc31db11" (UID: "ff18ff5e-22b5-464b-a2f0-f879bc31db11"). InnerVolumeSpecName "kube-api-access-vpf92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.442898 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory" (OuterVolumeSpecName: "inventory") pod "ff18ff5e-22b5-464b-a2f0-f879bc31db11" (UID: "ff18ff5e-22b5-464b-a2f0-f879bc31db11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.443439 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ff18ff5e-22b5-464b-a2f0-f879bc31db11" (UID: "ff18ff5e-22b5-464b-a2f0-f879bc31db11"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.506657 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpf92\" (UniqueName: \"kubernetes.io/projected/ff18ff5e-22b5-464b-a2f0-f879bc31db11-kube-api-access-vpf92\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.506733 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.506755 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.506763 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff18ff5e-22b5-464b-a2f0-f879bc31db11-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.843737 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" event={"ID":"ff18ff5e-22b5-464b-a2f0-f879bc31db11","Type":"ContainerDied","Data":"37f86e93983a7eb4aef25571812edeb56f603b52980d2959d04315e1d2b7849b"} Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.844184 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f86e93983a7eb4aef25571812edeb56f603b52980d2959d04315e1d2b7849b" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.843842 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k6sr2" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.924924 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-8tnqf"] Oct 10 08:31:45 crc kubenswrapper[4822]: E1010 08:31:45.925388 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff18ff5e-22b5-464b-a2f0-f879bc31db11" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.925404 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff18ff5e-22b5-464b-a2f0-f879bc31db11" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.925648 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff18ff5e-22b5-464b-a2f0-f879bc31db11" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.926611 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.928541 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.928873 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.930820 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.932719 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:31:45 crc kubenswrapper[4822]: I1010 08:31:45.937184 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-8tnqf"] Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.017647 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.018061 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.018177 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw22v\" (UniqueName: 
\"kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.018358 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.120543 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.120667 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw22v\" (UniqueName: \"kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.120760 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.120833 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.128502 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.128961 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.137103 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.137951 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw22v\" (UniqueName: \"kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v\") pod \"ssh-known-hosts-openstack-8tnqf\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.246143 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.801220 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-8tnqf"] Oct 10 08:31:46 crc kubenswrapper[4822]: I1010 08:31:46.854966 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8tnqf" event={"ID":"43ed3e6d-8ebe-4534-9db2-d84e95cf0748","Type":"ContainerStarted","Data":"90914239d37eff6cfc7dcc4ab4b44d4d2004d0343fe9b3c2343ff9aa11f9184c"} Oct 10 08:31:47 crc kubenswrapper[4822]: I1010 08:31:47.867266 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8tnqf" event={"ID":"43ed3e6d-8ebe-4534-9db2-d84e95cf0748","Type":"ContainerStarted","Data":"932bf3a0e0ad494147bc8285025cc98895e00ce16cd4ba10495fa398adc6cbf2"} Oct 10 08:31:47 crc kubenswrapper[4822]: I1010 08:31:47.894312 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-8tnqf" podStartSLOduration=2.437129433 podStartE2EDuration="2.894291801s" podCreationTimestamp="2025-10-10 08:31:45 +0000 UTC" firstStartedPulling="2025-10-10 08:31:46.805498796 +0000 UTC m=+7653.900656992" lastFinishedPulling="2025-10-10 08:31:47.262661154 +0000 UTC m=+7654.357819360" observedRunningTime="2025-10-10 08:31:47.883451199 +0000 UTC m=+7654.978609415" watchObservedRunningTime="2025-10-10 08:31:47.894291801 +0000 UTC m=+7654.989449997" Oct 10 08:31:55 crc kubenswrapper[4822]: I1010 08:31:55.961867 4822 generic.go:334] "Generic (PLEG): container finished" podID="43ed3e6d-8ebe-4534-9db2-d84e95cf0748" containerID="932bf3a0e0ad494147bc8285025cc98895e00ce16cd4ba10495fa398adc6cbf2" exitCode=0 Oct 10 08:31:55 crc kubenswrapper[4822]: I1010 08:31:55.961951 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8tnqf" 
event={"ID":"43ed3e6d-8ebe-4534-9db2-d84e95cf0748","Type":"ContainerDied","Data":"932bf3a0e0ad494147bc8285025cc98895e00ce16cd4ba10495fa398adc6cbf2"} Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.457856 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.478372 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0\") pod \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.478453 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph\") pod \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.478656 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1\") pod \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.478696 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw22v\" (UniqueName: \"kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v\") pod \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\" (UID: \"43ed3e6d-8ebe-4534-9db2-d84e95cf0748\") " Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.486377 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph" (OuterVolumeSpecName: "ceph") pod 
"43ed3e6d-8ebe-4534-9db2-d84e95cf0748" (UID: "43ed3e6d-8ebe-4534-9db2-d84e95cf0748"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.490966 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v" (OuterVolumeSpecName: "kube-api-access-gw22v") pod "43ed3e6d-8ebe-4534-9db2-d84e95cf0748" (UID: "43ed3e6d-8ebe-4534-9db2-d84e95cf0748"). InnerVolumeSpecName "kube-api-access-gw22v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.535839 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "43ed3e6d-8ebe-4534-9db2-d84e95cf0748" (UID: "43ed3e6d-8ebe-4534-9db2-d84e95cf0748"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.535864 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "43ed3e6d-8ebe-4534-9db2-d84e95cf0748" (UID: "43ed3e6d-8ebe-4534-9db2-d84e95cf0748"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.580228 4822 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.580260 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.580270 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.580281 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw22v\" (UniqueName: \"kubernetes.io/projected/43ed3e6d-8ebe-4534-9db2-d84e95cf0748-kube-api-access-gw22v\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.983997 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8tnqf" event={"ID":"43ed3e6d-8ebe-4534-9db2-d84e95cf0748","Type":"ContainerDied","Data":"90914239d37eff6cfc7dcc4ab4b44d4d2004d0343fe9b3c2343ff9aa11f9184c"} Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.984039 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90914239d37eff6cfc7dcc4ab4b44d4d2004d0343fe9b3c2343ff9aa11f9184c" Oct 10 08:31:57 crc kubenswrapper[4822]: I1010 08:31:57.984046 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8tnqf" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.059226 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ttd5s"] Oct 10 08:31:58 crc kubenswrapper[4822]: E1010 08:31:58.060538 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed3e6d-8ebe-4534-9db2-d84e95cf0748" containerName="ssh-known-hosts-openstack" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.060563 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed3e6d-8ebe-4534-9db2-d84e95cf0748" containerName="ssh-known-hosts-openstack" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.061142 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ed3e6d-8ebe-4534-9db2-d84e95cf0748" containerName="ssh-known-hosts-openstack" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.062454 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.064836 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.065235 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.068105 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.069114 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.094111 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ttd5s"] Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 
08:31:58.192533 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.192908 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pprcs\" (UniqueName: \"kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.192949 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.193009 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.295182 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " 
pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.295259 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pprcs\" (UniqueName: \"kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.295310 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.295375 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.300351 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.300439 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " 
pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.300706 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.312330 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pprcs\" (UniqueName: \"kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs\") pod \"run-os-openstack-openstack-cell1-ttd5s\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.402639 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.911741 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-ttd5s"] Oct 10 08:31:58 crc kubenswrapper[4822]: I1010 08:31:58.994286 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" event={"ID":"6114ecb9-28a8-4e70-96a1-ed43697c60b8","Type":"ContainerStarted","Data":"2bad9dd00866719c270aa7bf35ccebbb858571de1359db2e9bd90de3c6f693fa"} Oct 10 08:32:00 crc kubenswrapper[4822]: I1010 08:32:00.011438 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" event={"ID":"6114ecb9-28a8-4e70-96a1-ed43697c60b8","Type":"ContainerStarted","Data":"5481c78d9e764e543a408746bb7cd8a59a11730d98f285d2806e9ef8206cb1d3"} Oct 10 08:32:00 crc kubenswrapper[4822]: I1010 08:32:00.048405 4822 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" podStartSLOduration=1.600513734 podStartE2EDuration="2.048372855s" podCreationTimestamp="2025-10-10 08:31:58 +0000 UTC" firstStartedPulling="2025-10-10 08:31:58.916223872 +0000 UTC m=+7666.011382068" lastFinishedPulling="2025-10-10 08:31:59.364082983 +0000 UTC m=+7666.459241189" observedRunningTime="2025-10-10 08:32:00.043249317 +0000 UTC m=+7667.138407513" watchObservedRunningTime="2025-10-10 08:32:00.048372855 +0000 UTC m=+7667.143531071" Oct 10 08:32:09 crc kubenswrapper[4822]: I1010 08:32:09.108982 4822 generic.go:334] "Generic (PLEG): container finished" podID="6114ecb9-28a8-4e70-96a1-ed43697c60b8" containerID="5481c78d9e764e543a408746bb7cd8a59a11730d98f285d2806e9ef8206cb1d3" exitCode=0 Oct 10 08:32:09 crc kubenswrapper[4822]: I1010 08:32:09.109082 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" event={"ID":"6114ecb9-28a8-4e70-96a1-ed43697c60b8","Type":"ContainerDied","Data":"5481c78d9e764e543a408746bb7cd8a59a11730d98f285d2806e9ef8206cb1d3"} Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.621887 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.770727 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key\") pod \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.770874 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pprcs\" (UniqueName: \"kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs\") pod \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.770990 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph\") pod \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.771038 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory\") pod \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\" (UID: \"6114ecb9-28a8-4e70-96a1-ed43697c60b8\") " Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.780480 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph" (OuterVolumeSpecName: "ceph") pod "6114ecb9-28a8-4e70-96a1-ed43697c60b8" (UID: "6114ecb9-28a8-4e70-96a1-ed43697c60b8"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.780776 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs" (OuterVolumeSpecName: "kube-api-access-pprcs") pod "6114ecb9-28a8-4e70-96a1-ed43697c60b8" (UID: "6114ecb9-28a8-4e70-96a1-ed43697c60b8"). InnerVolumeSpecName "kube-api-access-pprcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.799991 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6114ecb9-28a8-4e70-96a1-ed43697c60b8" (UID: "6114ecb9-28a8-4e70-96a1-ed43697c60b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.801781 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory" (OuterVolumeSpecName: "inventory") pod "6114ecb9-28a8-4e70-96a1-ed43697c60b8" (UID: "6114ecb9-28a8-4e70-96a1-ed43697c60b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.873968 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pprcs\" (UniqueName: \"kubernetes.io/projected/6114ecb9-28a8-4e70-96a1-ed43697c60b8-kube-api-access-pprcs\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.874009 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.874020 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:10 crc kubenswrapper[4822]: I1010 08:32:10.874028 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6114ecb9-28a8-4e70-96a1-ed43697c60b8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.155565 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" event={"ID":"6114ecb9-28a8-4e70-96a1-ed43697c60b8","Type":"ContainerDied","Data":"2bad9dd00866719c270aa7bf35ccebbb858571de1359db2e9bd90de3c6f693fa"} Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.155911 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bad9dd00866719c270aa7bf35ccebbb858571de1359db2e9bd90de3c6f693fa" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.155772 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-ttd5s" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.272592 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-962c2"] Oct 10 08:32:11 crc kubenswrapper[4822]: E1010 08:32:11.274131 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6114ecb9-28a8-4e70-96a1-ed43697c60b8" containerName="run-os-openstack-openstack-cell1" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.274155 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6114ecb9-28a8-4e70-96a1-ed43697c60b8" containerName="run-os-openstack-openstack-cell1" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.275673 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6114ecb9-28a8-4e70-96a1-ed43697c60b8" containerName="run-os-openstack-openstack-cell1" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.278600 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.281948 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.282209 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.282362 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.283253 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.284980 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-962c2"] Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.387657 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47bds\" (UniqueName: \"kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.388014 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.388564 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.388661 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.490711 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.490769 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.490832 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47bds\" (UniqueName: \"kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.490891 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.495716 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.496652 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.499076 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.507736 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47bds\" (UniqueName: \"kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds\") pod \"reboot-os-openstack-openstack-cell1-962c2\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:11 crc kubenswrapper[4822]: I1010 08:32:11.606203 4822 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:12 crc kubenswrapper[4822]: I1010 08:32:12.195288 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-962c2"] Oct 10 08:32:13 crc kubenswrapper[4822]: I1010 08:32:13.180624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" event={"ID":"71cdd189-590b-495e-b841-83ab3662979f","Type":"ContainerStarted","Data":"b89911177f195ba144bffe2de2a1c032b7fa9cd8bd0964232b2ca81a8e9b9a95"} Oct 10 08:32:13 crc kubenswrapper[4822]: I1010 08:32:13.180683 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" event={"ID":"71cdd189-590b-495e-b841-83ab3662979f","Type":"ContainerStarted","Data":"ab26f9fa32a12db8009457c2afaf30f23ee8164556114e09d4e590930aa1bc51"} Oct 10 08:32:13 crc kubenswrapper[4822]: I1010 08:32:13.207324 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" podStartSLOduration=1.7205054199999998 podStartE2EDuration="2.20728831s" podCreationTimestamp="2025-10-10 08:32:11 +0000 UTC" firstStartedPulling="2025-10-10 08:32:12.204478128 +0000 UTC m=+7679.299636334" lastFinishedPulling="2025-10-10 08:32:12.691261028 +0000 UTC m=+7679.786419224" observedRunningTime="2025-10-10 08:32:13.20589104 +0000 UTC m=+7680.301049286" watchObservedRunningTime="2025-10-10 08:32:13.20728831 +0000 UTC m=+7680.302446506" Oct 10 08:32:29 crc kubenswrapper[4822]: I1010 08:32:29.364775 4822 generic.go:334] "Generic (PLEG): container finished" podID="71cdd189-590b-495e-b841-83ab3662979f" containerID="b89911177f195ba144bffe2de2a1c032b7fa9cd8bd0964232b2ca81a8e9b9a95" exitCode=0 Oct 10 08:32:29 crc kubenswrapper[4822]: I1010 08:32:29.364933 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" 
event={"ID":"71cdd189-590b-495e-b841-83ab3662979f","Type":"ContainerDied","Data":"b89911177f195ba144bffe2de2a1c032b7fa9cd8bd0964232b2ca81a8e9b9a95"} Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.853359 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.958455 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47bds\" (UniqueName: \"kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds\") pod \"71cdd189-590b-495e-b841-83ab3662979f\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.959618 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph\") pod \"71cdd189-590b-495e-b841-83ab3662979f\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.960213 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory\") pod \"71cdd189-590b-495e-b841-83ab3662979f\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.960577 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key\") pod \"71cdd189-590b-495e-b841-83ab3662979f\" (UID: \"71cdd189-590b-495e-b841-83ab3662979f\") " Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.965301 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds" (OuterVolumeSpecName: 
"kube-api-access-47bds") pod "71cdd189-590b-495e-b841-83ab3662979f" (UID: "71cdd189-590b-495e-b841-83ab3662979f"). InnerVolumeSpecName "kube-api-access-47bds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.966279 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph" (OuterVolumeSpecName: "ceph") pod "71cdd189-590b-495e-b841-83ab3662979f" (UID: "71cdd189-590b-495e-b841-83ab3662979f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.989146 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory" (OuterVolumeSpecName: "inventory") pod "71cdd189-590b-495e-b841-83ab3662979f" (UID: "71cdd189-590b-495e-b841-83ab3662979f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:30 crc kubenswrapper[4822]: I1010 08:32:30.990863 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71cdd189-590b-495e-b841-83ab3662979f" (UID: "71cdd189-590b-495e-b841-83ab3662979f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.064359 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.064393 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47bds\" (UniqueName: \"kubernetes.io/projected/71cdd189-590b-495e-b841-83ab3662979f-kube-api-access-47bds\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.064406 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.064414 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71cdd189-590b-495e-b841-83ab3662979f-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.388493 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" event={"ID":"71cdd189-590b-495e-b841-83ab3662979f","Type":"ContainerDied","Data":"ab26f9fa32a12db8009457c2afaf30f23ee8164556114e09d4e590930aa1bc51"} Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.388533 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab26f9fa32a12db8009457c2afaf30f23ee8164556114e09d4e590930aa1bc51" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.388595 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-962c2" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.524588 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-c4wdc"] Oct 10 08:32:31 crc kubenswrapper[4822]: E1010 08:32:31.525186 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cdd189-590b-495e-b841-83ab3662979f" containerName="reboot-os-openstack-openstack-cell1" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.525210 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cdd189-590b-495e-b841-83ab3662979f" containerName="reboot-os-openstack-openstack-cell1" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.525475 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cdd189-590b-495e-b841-83ab3662979f" containerName="reboot-os-openstack-openstack-cell1" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.526345 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.528311 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.528580 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.528946 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.529773 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.535176 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-c4wdc"] Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.676682 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.676762 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677027 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677255 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqfgr\" (UniqueName: \"kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677294 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677319 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677527 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677604 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677652 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.677836 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " 
pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.678159 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.779653 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqfgr\" (UniqueName: \"kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.779737 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.779763 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.779789 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.779960 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.780498 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.780528 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.780662 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc 
kubenswrapper[4822]: I1010 08:32:31.780712 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.781243 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.781407 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.781503 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.785568 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.785616 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.786002 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.786606 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.787013 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 
08:32:31.789368 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.792098 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.792217 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.794600 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.796565 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.797710 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqfgr\" (UniqueName: \"kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.797762 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory\") pod \"install-certs-openstack-openstack-cell1-c4wdc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:31 crc kubenswrapper[4822]: I1010 08:32:31.845646 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:33 crc kubenswrapper[4822]: I1010 08:32:33.017571 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-c4wdc"] Oct 10 08:32:33 crc kubenswrapper[4822]: I1010 08:32:33.407461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" event={"ID":"628e74c8-1d98-4722-906f-588c48bea3fc","Type":"ContainerStarted","Data":"a18f7d133e71150539977c4a9ae1a41e746f2f4a10343d725a2f9275ca8fcec4"} Oct 10 08:32:34 crc kubenswrapper[4822]: I1010 08:32:34.419926 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" event={"ID":"628e74c8-1d98-4722-906f-588c48bea3fc","Type":"ContainerStarted","Data":"094ad8ea616ac2b59537a8dd92c4b11c2059753672e12271a5abd6d5f9836bc2"} Oct 10 08:32:34 crc kubenswrapper[4822]: I1010 08:32:34.445532 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" podStartSLOduration=2.989323351 podStartE2EDuration="3.445511181s" podCreationTimestamp="2025-10-10 08:32:31 +0000 UTC" firstStartedPulling="2025-10-10 08:32:33.024741308 +0000 UTC m=+7700.119899514" lastFinishedPulling="2025-10-10 08:32:33.480929128 +0000 UTC m=+7700.576087344" observedRunningTime="2025-10-10 08:32:34.444638126 +0000 UTC m=+7701.539796382" watchObservedRunningTime="2025-10-10 08:32:34.445511181 +0000 UTC m=+7701.540669377" Oct 10 08:32:52 crc kubenswrapper[4822]: I1010 08:32:52.636891 4822 generic.go:334] "Generic (PLEG): container finished" podID="628e74c8-1d98-4722-906f-588c48bea3fc" containerID="094ad8ea616ac2b59537a8dd92c4b11c2059753672e12271a5abd6d5f9836bc2" exitCode=0 Oct 10 08:32:52 crc kubenswrapper[4822]: I1010 08:32:52.636988 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" event={"ID":"628e74c8-1d98-4722-906f-588c48bea3fc","Type":"ContainerDied","Data":"094ad8ea616ac2b59537a8dd92c4b11c2059753672e12271a5abd6d5f9836bc2"} Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.118234 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219516 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqfgr\" (UniqueName: \"kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219580 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219603 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219670 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219709 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219737 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219787 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219819 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219867 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219906 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.219995 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.220049 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle\") pod \"628e74c8-1d98-4722-906f-588c48bea3fc\" (UID: \"628e74c8-1d98-4722-906f-588c48bea3fc\") " Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.268926 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.269058 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.269187 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.270972 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.272961 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr" (OuterVolumeSpecName: "kube-api-access-nqfgr") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "kube-api-access-nqfgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.273037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.273089 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.274053 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.290172 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.294053 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph" (OuterVolumeSpecName: "ceph") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325344 4822 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325383 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325394 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325406 4822 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325418 4822 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325426 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325435 4822 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325445 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325453 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.325461 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqfgr\" (UniqueName: \"kubernetes.io/projected/628e74c8-1d98-4722-906f-588c48bea3fc-kube-api-access-nqfgr\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.342033 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory" (OuterVolumeSpecName: "inventory") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.382187 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "628e74c8-1d98-4722-906f-588c48bea3fc" (UID: "628e74c8-1d98-4722-906f-588c48bea3fc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.427348 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.427382 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/628e74c8-1d98-4722-906f-588c48bea3fc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.657925 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" event={"ID":"628e74c8-1d98-4722-906f-588c48bea3fc","Type":"ContainerDied","Data":"a18f7d133e71150539977c4a9ae1a41e746f2f4a10343d725a2f9275ca8fcec4"} Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.657976 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18f7d133e71150539977c4a9ae1a41e746f2f4a10343d725a2f9275ca8fcec4" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.657998 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-c4wdc" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.810035 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-b427v"] Oct 10 08:32:54 crc kubenswrapper[4822]: E1010 08:32:54.810491 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e74c8-1d98-4722-906f-588c48bea3fc" containerName="install-certs-openstack-openstack-cell1" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.810509 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e74c8-1d98-4722-906f-588c48bea3fc" containerName="install-certs-openstack-openstack-cell1" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.810714 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e74c8-1d98-4722-906f-588c48bea3fc" containerName="install-certs-openstack-openstack-cell1" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.811425 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.814007 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.814033 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.814129 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.814273 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.829142 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-b427v"] Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.940583 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.941252 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.941317 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4k5\" (UniqueName: 
\"kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:54 crc kubenswrapper[4822]: I1010 08:32:54.941462 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.044190 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.044331 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.044430 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4k5\" (UniqueName: \"kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.044570 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.053336 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.054082 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.062560 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.066554 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4k5\" (UniqueName: \"kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5\") pod \"ceph-client-openstack-openstack-cell1-b427v\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.139666 4822 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:32:55 crc kubenswrapper[4822]: I1010 08:32:55.712412 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-b427v"] Oct 10 08:32:56 crc kubenswrapper[4822]: I1010 08:32:56.704697 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" event={"ID":"2c56d568-594f-4562-bfd2-db0617239e9c","Type":"ContainerStarted","Data":"cffafbe7cef272a5b86f76a92e004ad2394e2ddaab877ed6206b2f9ff6d58ad0"} Oct 10 08:32:56 crc kubenswrapper[4822]: I1010 08:32:56.705251 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" event={"ID":"2c56d568-594f-4562-bfd2-db0617239e9c","Type":"ContainerStarted","Data":"0f8b6b0b3b40b3468d571fdcd732cec2db31327e7dca957587abc6ed2b29c358"} Oct 10 08:32:56 crc kubenswrapper[4822]: I1010 08:32:56.731604 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" podStartSLOduration=2.254350476 podStartE2EDuration="2.731584582s" podCreationTimestamp="2025-10-10 08:32:54 +0000 UTC" firstStartedPulling="2025-10-10 08:32:55.716476255 +0000 UTC m=+7722.811634451" lastFinishedPulling="2025-10-10 08:32:56.193710361 +0000 UTC m=+7723.288868557" observedRunningTime="2025-10-10 08:32:56.727094553 +0000 UTC m=+7723.822252749" watchObservedRunningTime="2025-10-10 08:32:56.731584582 +0000 UTC m=+7723.826742788" Oct 10 08:33:01 crc kubenswrapper[4822]: I1010 08:33:01.336970 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:33:01 crc kubenswrapper[4822]: I1010 
08:33:01.337398 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:33:01 crc kubenswrapper[4822]: I1010 08:33:01.751926 4822 generic.go:334] "Generic (PLEG): container finished" podID="2c56d568-594f-4562-bfd2-db0617239e9c" containerID="cffafbe7cef272a5b86f76a92e004ad2394e2ddaab877ed6206b2f9ff6d58ad0" exitCode=0 Oct 10 08:33:01 crc kubenswrapper[4822]: I1010 08:33:01.751979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" event={"ID":"2c56d568-594f-4562-bfd2-db0617239e9c","Type":"ContainerDied","Data":"cffafbe7cef272a5b86f76a92e004ad2394e2ddaab877ed6206b2f9ff6d58ad0"} Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.272679 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.352288 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory\") pod \"2c56d568-594f-4562-bfd2-db0617239e9c\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.352713 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key\") pod \"2c56d568-594f-4562-bfd2-db0617239e9c\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.352947 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b4k5\" (UniqueName: \"kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5\") pod \"2c56d568-594f-4562-bfd2-db0617239e9c\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.353046 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph\") pod \"2c56d568-594f-4562-bfd2-db0617239e9c\" (UID: \"2c56d568-594f-4562-bfd2-db0617239e9c\") " Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.358544 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5" (OuterVolumeSpecName: "kube-api-access-4b4k5") pod "2c56d568-594f-4562-bfd2-db0617239e9c" (UID: "2c56d568-594f-4562-bfd2-db0617239e9c"). InnerVolumeSpecName "kube-api-access-4b4k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.361502 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph" (OuterVolumeSpecName: "ceph") pod "2c56d568-594f-4562-bfd2-db0617239e9c" (UID: "2c56d568-594f-4562-bfd2-db0617239e9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.386409 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory" (OuterVolumeSpecName: "inventory") pod "2c56d568-594f-4562-bfd2-db0617239e9c" (UID: "2c56d568-594f-4562-bfd2-db0617239e9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.391145 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c56d568-594f-4562-bfd2-db0617239e9c" (UID: "2c56d568-594f-4562-bfd2-db0617239e9c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.455098 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.455138 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b4k5\" (UniqueName: \"kubernetes.io/projected/2c56d568-594f-4562-bfd2-db0617239e9c-kube-api-access-4b4k5\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.455149 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.455157 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c56d568-594f-4562-bfd2-db0617239e9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.773981 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" event={"ID":"2c56d568-594f-4562-bfd2-db0617239e9c","Type":"ContainerDied","Data":"0f8b6b0b3b40b3468d571fdcd732cec2db31327e7dca957587abc6ed2b29c358"} Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.774072 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8b6b0b3b40b3468d571fdcd732cec2db31327e7dca957587abc6ed2b29c358" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.774064 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-b427v" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.854982 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m7m8g"] Oct 10 08:33:03 crc kubenswrapper[4822]: E1010 08:33:03.855420 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c56d568-594f-4562-bfd2-db0617239e9c" containerName="ceph-client-openstack-openstack-cell1" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.855438 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c56d568-594f-4562-bfd2-db0617239e9c" containerName="ceph-client-openstack-openstack-cell1" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.855635 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c56d568-594f-4562-bfd2-db0617239e9c" containerName="ceph-client-openstack-openstack-cell1" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.856414 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.858195 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.858570 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.858618 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.858826 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.859876 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.869684 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m7m8g"] Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.965296 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.965452 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 
08:33:03.965506 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.965537 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.965746 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cppq8\" (UniqueName: \"kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:03 crc kubenswrapper[4822]: I1010 08:33:03.965913 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068189 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " 
pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068280 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068317 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068387 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cppq8\" (UniqueName: \"kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068432 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.068519 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: 
\"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.069913 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.072257 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.073202 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.073609 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.074702 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " 
pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.087395 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cppq8\" (UniqueName: \"kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8\") pod \"ovn-openstack-openstack-cell1-m7m8g\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.222440 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:33:04 crc kubenswrapper[4822]: I1010 08:33:04.786689 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-m7m8g"] Oct 10 08:33:05 crc kubenswrapper[4822]: I1010 08:33:05.801765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" event={"ID":"9008d029-8eb9-482b-bd9a-fd5bc2259bc1","Type":"ContainerStarted","Data":"b6e29b2683ec90cea097c5b17fea1201c4f7359495d51d6b4a7fa0a35a2be398"} Oct 10 08:33:05 crc kubenswrapper[4822]: I1010 08:33:05.802144 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" event={"ID":"9008d029-8eb9-482b-bd9a-fd5bc2259bc1","Type":"ContainerStarted","Data":"844ffd93ea815c0181d5bdab985a97abea1e0fd061331f34dc0ea24a1413ecdb"} Oct 10 08:33:05 crc kubenswrapper[4822]: I1010 08:33:05.820113 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" podStartSLOduration=2.125902987 podStartE2EDuration="2.820089654s" podCreationTimestamp="2025-10-10 08:33:03 +0000 UTC" firstStartedPulling="2025-10-10 08:33:04.792748066 +0000 UTC m=+7731.887906262" lastFinishedPulling="2025-10-10 08:33:05.486934733 +0000 UTC m=+7732.582092929" observedRunningTime="2025-10-10 08:33:05.819998651 +0000 UTC 
m=+7732.915156847" watchObservedRunningTime="2025-10-10 08:33:05.820089654 +0000 UTC m=+7732.915247870" Oct 10 08:33:31 crc kubenswrapper[4822]: I1010 08:33:31.337401 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:33:31 crc kubenswrapper[4822]: I1010 08:33:31.338160 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:34:01 crc kubenswrapper[4822]: I1010 08:34:01.336235 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:34:01 crc kubenswrapper[4822]: I1010 08:34:01.336764 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:34:01 crc kubenswrapper[4822]: I1010 08:34:01.336821 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:34:01 crc kubenswrapper[4822]: I1010 08:34:01.337680 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:34:01 crc kubenswrapper[4822]: I1010 08:34:01.337741 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" gracePeriod=600 Oct 10 08:34:01 crc kubenswrapper[4822]: E1010 08:34:01.458922 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:34:02 crc kubenswrapper[4822]: I1010 08:34:02.420551 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" exitCode=0 Oct 10 08:34:02 crc kubenswrapper[4822]: I1010 08:34:02.420624 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1"} Oct 10 08:34:02 crc kubenswrapper[4822]: I1010 08:34:02.420703 4822 scope.go:117] "RemoveContainer" containerID="256f45c7ea0d8da7f0201183c607d7f15e72b8de5f8d9482c782fb16d58eaeb7" Oct 10 08:34:02 crc kubenswrapper[4822]: I1010 08:34:02.421673 4822 
scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:34:02 crc kubenswrapper[4822]: E1010 08:34:02.422291 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:34:11 crc kubenswrapper[4822]: I1010 08:34:11.564615 4822 generic.go:334] "Generic (PLEG): container finished" podID="9008d029-8eb9-482b-bd9a-fd5bc2259bc1" containerID="b6e29b2683ec90cea097c5b17fea1201c4f7359495d51d6b4a7fa0a35a2be398" exitCode=0 Oct 10 08:34:11 crc kubenswrapper[4822]: I1010 08:34:11.564704 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" event={"ID":"9008d029-8eb9-482b-bd9a-fd5bc2259bc1","Type":"ContainerDied","Data":"b6e29b2683ec90cea097c5b17fea1201c4f7359495d51d6b4a7fa0a35a2be398"} Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.155127 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.219701 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.220251 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.220301 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.220374 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cppq8\" (UniqueName: \"kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.220500 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.220535 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key\") pod \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\" (UID: \"9008d029-8eb9-482b-bd9a-fd5bc2259bc1\") " Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.230564 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph" (OuterVolumeSpecName: "ceph") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.231149 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8" (OuterVolumeSpecName: "kube-api-access-cppq8") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "kube-api-access-cppq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.234045 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.260008 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.262158 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.263869 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory" (OuterVolumeSpecName: "inventory") pod "9008d029-8eb9-482b-bd9a-fd5bc2259bc1" (UID: "9008d029-8eb9-482b-bd9a-fd5bc2259bc1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323501 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323535 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323568 4822 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323579 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cppq8\" (UniqueName: 
\"kubernetes.io/projected/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-kube-api-access-cppq8\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323587 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.323596 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9008d029-8eb9-482b-bd9a-fd5bc2259bc1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.595166 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" event={"ID":"9008d029-8eb9-482b-bd9a-fd5bc2259bc1","Type":"ContainerDied","Data":"844ffd93ea815c0181d5bdab985a97abea1e0fd061331f34dc0ea24a1413ecdb"} Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.595204 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844ffd93ea815c0181d5bdab985a97abea1e0fd061331f34dc0ea24a1413ecdb" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.595263 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-m7m8g" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.741763 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-lr864"] Oct 10 08:34:13 crc kubenswrapper[4822]: E1010 08:34:13.742316 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9008d029-8eb9-482b-bd9a-fd5bc2259bc1" containerName="ovn-openstack-openstack-cell1" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.742336 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9008d029-8eb9-482b-bd9a-fd5bc2259bc1" containerName="ovn-openstack-openstack-cell1" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.742565 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9008d029-8eb9-482b-bd9a-fd5bc2259bc1" containerName="ovn-openstack-openstack-cell1" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.743474 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-lr864"] Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.743607 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.745716 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.746083 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.746098 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.746741 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.747603 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.748764 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837290 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837477 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: 
\"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837561 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837630 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837651 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrwm\" (UniqueName: \"kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.837725 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc 
kubenswrapper[4822]: I1010 08:34:13.837778 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.938828 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.938893 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.938957 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.939027 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.939113 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.939148 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.939167 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrwm\" (UniqueName: \"kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.943026 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 
crc kubenswrapper[4822]: I1010 08:34:13.943347 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.943687 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.945109 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.946112 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.956679 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:13 crc kubenswrapper[4822]: I1010 08:34:13.969177 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrwm\" (UniqueName: \"kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm\") pod \"neutron-metadata-openstack-openstack-cell1-lr864\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:14 crc kubenswrapper[4822]: I1010 08:34:14.066518 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:34:14 crc kubenswrapper[4822]: I1010 08:34:14.696696 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-lr864"] Oct 10 08:34:15 crc kubenswrapper[4822]: I1010 08:34:15.626589 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" event={"ID":"06e3062b-6e37-4323-97a4-03c9dda66887","Type":"ContainerStarted","Data":"17e71dc738fa8e404cf80eb0f10191aef946eae176b5ef46678500ad6472235e"} Oct 10 08:34:15 crc kubenswrapper[4822]: I1010 08:34:15.626965 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" event={"ID":"06e3062b-6e37-4323-97a4-03c9dda66887","Type":"ContainerStarted","Data":"f43df193ac29cb33ff1f4216d09838c499c02469274d0df23b0aafa1393ab71c"} Oct 10 08:34:15 crc kubenswrapper[4822]: I1010 08:34:15.641221 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" podStartSLOduration=2.114975668 
podStartE2EDuration="2.641198763s" podCreationTimestamp="2025-10-10 08:34:13 +0000 UTC" firstStartedPulling="2025-10-10 08:34:14.704821141 +0000 UTC m=+7801.799979337" lastFinishedPulling="2025-10-10 08:34:15.231044236 +0000 UTC m=+7802.326202432" observedRunningTime="2025-10-10 08:34:15.640144433 +0000 UTC m=+7802.735302639" watchObservedRunningTime="2025-10-10 08:34:15.641198763 +0000 UTC m=+7802.736356969" Oct 10 08:34:17 crc kubenswrapper[4822]: I1010 08:34:17.650442 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:34:17 crc kubenswrapper[4822]: E1010 08:34:17.651392 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:34:28 crc kubenswrapper[4822]: I1010 08:34:28.651213 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:34:28 crc kubenswrapper[4822]: E1010 08:34:28.651946 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:34:41 crc kubenswrapper[4822]: I1010 08:34:41.650978 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:34:41 crc kubenswrapper[4822]: E1010 08:34:41.651874 
4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:34:55 crc kubenswrapper[4822]: I1010 08:34:55.650555 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:34:55 crc kubenswrapper[4822]: E1010 08:34:55.651433 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:35:10 crc kubenswrapper[4822]: I1010 08:35:10.215884 4822 generic.go:334] "Generic (PLEG): container finished" podID="06e3062b-6e37-4323-97a4-03c9dda66887" containerID="17e71dc738fa8e404cf80eb0f10191aef946eae176b5ef46678500ad6472235e" exitCode=0 Oct 10 08:35:10 crc kubenswrapper[4822]: I1010 08:35:10.215985 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" event={"ID":"06e3062b-6e37-4323-97a4-03c9dda66887","Type":"ContainerDied","Data":"17e71dc738fa8e404cf80eb0f10191aef946eae176b5ef46678500ad6472235e"} Oct 10 08:35:10 crc kubenswrapper[4822]: I1010 08:35:10.652312 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:35:10 crc kubenswrapper[4822]: E1010 08:35:10.653246 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.710589 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.822705 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.822767 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.822872 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.823296 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrwm\" (UniqueName: \"kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 
08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.823383 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.823537 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.823585 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle\") pod \"06e3062b-6e37-4323-97a4-03c9dda66887\" (UID: \"06e3062b-6e37-4323-97a4-03c9dda66887\") " Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.829059 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph" (OuterVolumeSpecName: "ceph") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.829326 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm" (OuterVolumeSpecName: "kube-api-access-qsrwm") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "kube-api-access-qsrwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.829463 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.857355 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.863815 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.864948 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory" (OuterVolumeSpecName: "inventory") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.865516 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "06e3062b-6e37-4323-97a4-03c9dda66887" (UID: "06e3062b-6e37-4323-97a4-03c9dda66887"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928183 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrwm\" (UniqueName: \"kubernetes.io/projected/06e3062b-6e37-4323-97a4-03c9dda66887-kube-api-access-qsrwm\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928231 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928241 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928253 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928263 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 
10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928271 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:11 crc kubenswrapper[4822]: I1010 08:35:11.928279 4822 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/06e3062b-6e37-4323-97a4-03c9dda66887-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.237648 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" event={"ID":"06e3062b-6e37-4323-97a4-03c9dda66887","Type":"ContainerDied","Data":"f43df193ac29cb33ff1f4216d09838c499c02469274d0df23b0aafa1393ab71c"} Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.238006 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43df193ac29cb33ff1f4216d09838c499c02469274d0df23b0aafa1393ab71c" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.237762 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-lr864" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.333854 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9slgf"] Oct 10 08:35:12 crc kubenswrapper[4822]: E1010 08:35:12.334470 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e3062b-6e37-4323-97a4-03c9dda66887" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.334490 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e3062b-6e37-4323-97a4-03c9dda66887" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.334866 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e3062b-6e37-4323-97a4-03c9dda66887" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.336276 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.348255 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9slgf"] Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.385049 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.385122 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.385048 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.385310 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.386632 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.439262 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pftdm\" (UniqueName: \"kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.439316 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc 
kubenswrapper[4822]: I1010 08:35:12.439387 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.439421 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.439505 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.439535 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541276 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pftdm\" (UniqueName: \"kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm\") pod 
\"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541339 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541438 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541488 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541615 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.541667 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.547906 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.548220 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.549076 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.552463 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.554075 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.563837 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pftdm\" (UniqueName: \"kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm\") pod \"libvirt-openstack-openstack-cell1-9slgf\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:12 crc kubenswrapper[4822]: I1010 08:35:12.708119 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:35:13 crc kubenswrapper[4822]: I1010 08:35:13.273022 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9slgf"] Oct 10 08:35:13 crc kubenswrapper[4822]: I1010 08:35:13.277068 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:35:13 crc kubenswrapper[4822]: I1010 08:35:13.715174 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:35:14 crc kubenswrapper[4822]: I1010 08:35:14.258849 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" event={"ID":"d85b9ab1-c113-48ef-9875-ba3ebf6427f9","Type":"ContainerStarted","Data":"935e0c812b0a6be71a60a7e98de6256810f79104cdb8f32aa520a5b3890a9df8"} Oct 10 08:35:14 crc kubenswrapper[4822]: I1010 08:35:14.259205 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" event={"ID":"d85b9ab1-c113-48ef-9875-ba3ebf6427f9","Type":"ContainerStarted","Data":"dfa6e04545f6d10e2456560fe51fd50205702fc9eb5593f7314914997daa0115"} 
Oct 10 08:35:14 crc kubenswrapper[4822]: I1010 08:35:14.278965 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" podStartSLOduration=1.8432440570000002 podStartE2EDuration="2.278945419s" podCreationTimestamp="2025-10-10 08:35:12 +0000 UTC" firstStartedPulling="2025-10-10 08:35:13.276798025 +0000 UTC m=+7860.371956211" lastFinishedPulling="2025-10-10 08:35:13.712499377 +0000 UTC m=+7860.807657573" observedRunningTime="2025-10-10 08:35:14.27304318 +0000 UTC m=+7861.368201366" watchObservedRunningTime="2025-10-10 08:35:14.278945419 +0000 UTC m=+7861.374103615" Oct 10 08:35:22 crc kubenswrapper[4822]: I1010 08:35:22.650412 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:35:22 crc kubenswrapper[4822]: E1010 08:35:22.651382 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.598760 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.601948 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.614069 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.708280 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66dk\" (UniqueName: \"kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.708794 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.708950 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.811507 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66dk\" (UniqueName: \"kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.811649 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.811681 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.812228 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.812309 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.831876 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66dk\" (UniqueName: \"kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk\") pod \"redhat-marketplace-qkhhx\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:27 crc kubenswrapper[4822]: I1010 08:35:27.925181 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:28 crc kubenswrapper[4822]: I1010 08:35:28.392409 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:28 crc kubenswrapper[4822]: W1010 08:35:28.398065 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod521abdc6_f3ce_47f2_8541_78a2ee5c3bcd.slice/crio-aafadb6363e1bd790ed93734c07aa28dfbb439f3d3dabf56a4d375184f9acb1d WatchSource:0}: Error finding container aafadb6363e1bd790ed93734c07aa28dfbb439f3d3dabf56a4d375184f9acb1d: Status 404 returned error can't find the container with id aafadb6363e1bd790ed93734c07aa28dfbb439f3d3dabf56a4d375184f9acb1d Oct 10 08:35:29 crc kubenswrapper[4822]: I1010 08:35:29.408902 4822 generic.go:334] "Generic (PLEG): container finished" podID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerID="0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3" exitCode=0 Oct 10 08:35:29 crc kubenswrapper[4822]: I1010 08:35:29.408961 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerDied","Data":"0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3"} Oct 10 08:35:29 crc kubenswrapper[4822]: I1010 08:35:29.409187 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerStarted","Data":"aafadb6363e1bd790ed93734c07aa28dfbb439f3d3dabf56a4d375184f9acb1d"} Oct 10 08:35:30 crc kubenswrapper[4822]: I1010 08:35:30.421674 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" 
event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerStarted","Data":"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b"} Oct 10 08:35:31 crc kubenswrapper[4822]: I1010 08:35:31.437343 4822 generic.go:334] "Generic (PLEG): container finished" podID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerID="c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b" exitCode=0 Oct 10 08:35:31 crc kubenswrapper[4822]: I1010 08:35:31.437392 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerDied","Data":"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b"} Oct 10 08:35:32 crc kubenswrapper[4822]: I1010 08:35:32.448109 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerStarted","Data":"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c"} Oct 10 08:35:32 crc kubenswrapper[4822]: I1010 08:35:32.472390 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qkhhx" podStartSLOduration=2.8744124920000003 podStartE2EDuration="5.472372335s" podCreationTimestamp="2025-10-10 08:35:27 +0000 UTC" firstStartedPulling="2025-10-10 08:35:29.411714225 +0000 UTC m=+7876.506872411" lastFinishedPulling="2025-10-10 08:35:32.009674028 +0000 UTC m=+7879.104832254" observedRunningTime="2025-10-10 08:35:32.466585039 +0000 UTC m=+7879.561743255" watchObservedRunningTime="2025-10-10 08:35:32.472372335 +0000 UTC m=+7879.567530551" Oct 10 08:35:35 crc kubenswrapper[4822]: I1010 08:35:35.651041 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:35:35 crc kubenswrapper[4822]: E1010 08:35:35.651642 4822 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:35:37 crc kubenswrapper[4822]: I1010 08:35:37.925485 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:37 crc kubenswrapper[4822]: I1010 08:35:37.926131 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:37 crc kubenswrapper[4822]: I1010 08:35:37.972043 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:38 crc kubenswrapper[4822]: I1010 08:35:38.558370 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:38 crc kubenswrapper[4822]: I1010 08:35:38.719035 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:40 crc kubenswrapper[4822]: I1010 08:35:40.536299 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qkhhx" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="registry-server" containerID="cri-o://e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c" gracePeriod=2 Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.097698 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.152713 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content\") pod \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.154192 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66dk\" (UniqueName: \"kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk\") pod \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.154398 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities\") pod \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\" (UID: \"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd\") " Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.155418 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities" (OuterVolumeSpecName: "utilities") pod "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" (UID: "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.160609 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk" (OuterVolumeSpecName: "kube-api-access-l66dk") pod "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" (UID: "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd"). InnerVolumeSpecName "kube-api-access-l66dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.168840 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" (UID: "521abdc6-f3ce-47f2-8541-78a2ee5c3bcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.257674 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.258015 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l66dk\" (UniqueName: \"kubernetes.io/projected/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-kube-api-access-l66dk\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.258028 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.547491 4822 generic.go:334] "Generic (PLEG): container finished" podID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerID="e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c" exitCode=0 Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.547533 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerDied","Data":"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c"} Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.547568 4822 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qkhhx" event={"ID":"521abdc6-f3ce-47f2-8541-78a2ee5c3bcd","Type":"ContainerDied","Data":"aafadb6363e1bd790ed93734c07aa28dfbb439f3d3dabf56a4d375184f9acb1d"} Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.547590 4822 scope.go:117] "RemoveContainer" containerID="e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.547592 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkhhx" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.573855 4822 scope.go:117] "RemoveContainer" containerID="c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.596357 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.599601 4822 scope.go:117] "RemoveContainer" containerID="0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.607424 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkhhx"] Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.668250 4822 scope.go:117] "RemoveContainer" containerID="e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c" Oct 10 08:35:41 crc kubenswrapper[4822]: E1010 08:35:41.668527 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c\": container with ID starting with e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c not found: ID does not exist" containerID="e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.668585 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c"} err="failed to get container status \"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c\": rpc error: code = NotFound desc = could not find container \"e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c\": container with ID starting with e260a26152071cd29004ebc7d55e04a2aa53e5cfb5651f13cb068b3b06492c1c not found: ID does not exist" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.668610 4822 scope.go:117] "RemoveContainer" containerID="c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b" Oct 10 08:35:41 crc kubenswrapper[4822]: E1010 08:35:41.669719 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b\": container with ID starting with c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b not found: ID does not exist" containerID="c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.669813 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b"} err="failed to get container status \"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b\": rpc error: code = NotFound desc = could not find container \"c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b\": container with ID starting with c4681bdad3932875356cae98b1753f3054b4bcbc691b2fd838ed0c4993b4089b not found: ID does not exist" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.669841 4822 scope.go:117] "RemoveContainer" containerID="0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3" Oct 10 08:35:41 crc kubenswrapper[4822]: E1010 
08:35:41.671283 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3\": container with ID starting with 0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3 not found: ID does not exist" containerID="0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.671314 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3"} err="failed to get container status \"0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3\": rpc error: code = NotFound desc = could not find container \"0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3\": container with ID starting with 0e168a19660d977e51ead2d934fa225f1ede7dc2ffab51db7a96217dd66efdd3 not found: ID does not exist" Oct 10 08:35:41 crc kubenswrapper[4822]: I1010 08:35:41.674763 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" path="/var/lib/kubelet/pods/521abdc6-f3ce-47f2-8541-78a2ee5c3bcd/volumes" Oct 10 08:35:50 crc kubenswrapper[4822]: I1010 08:35:50.651487 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:35:50 crc kubenswrapper[4822]: E1010 08:35:50.652309 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:36:04 crc kubenswrapper[4822]: I1010 08:36:04.653684 
4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:36:04 crc kubenswrapper[4822]: E1010 08:36:04.654847 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:36:16 crc kubenswrapper[4822]: I1010 08:36:16.651944 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:36:16 crc kubenswrapper[4822]: E1010 08:36:16.653223 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:36:27 crc kubenswrapper[4822]: I1010 08:36:27.651302 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:36:27 crc kubenswrapper[4822]: E1010 08:36:27.652193 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:36:39 crc kubenswrapper[4822]: I1010 
08:36:39.650051 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:36:39 crc kubenswrapper[4822]: E1010 08:36:39.651005 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:36:50 crc kubenswrapper[4822]: I1010 08:36:50.650467 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:36:50 crc kubenswrapper[4822]: E1010 08:36:50.651293 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:37:01 crc kubenswrapper[4822]: I1010 08:37:01.651825 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:37:01 crc kubenswrapper[4822]: E1010 08:37:01.653782 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:37:16 crc 
kubenswrapper[4822]: I1010 08:37:16.652176 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:37:16 crc kubenswrapper[4822]: E1010 08:37:16.653321 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:37:27 crc kubenswrapper[4822]: I1010 08:37:27.651312 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:37:27 crc kubenswrapper[4822]: E1010 08:37:27.652777 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:37:38 crc kubenswrapper[4822]: I1010 08:37:38.651022 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:37:38 crc kubenswrapper[4822]: E1010 08:37:38.652184 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 
10 08:37:53 crc kubenswrapper[4822]: I1010 08:37:53.676542 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:37:53 crc kubenswrapper[4822]: E1010 08:37:53.677660 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:38:06 crc kubenswrapper[4822]: I1010 08:38:06.651554 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:38:06 crc kubenswrapper[4822]: E1010 08:38:06.652928 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:38:21 crc kubenswrapper[4822]: I1010 08:38:21.650691 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:38:21 crc kubenswrapper[4822]: E1010 08:38:21.651831 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:38:33 crc kubenswrapper[4822]: I1010 08:38:33.659799 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:38:33 crc kubenswrapper[4822]: E1010 08:38:33.660765 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:38:46 crc kubenswrapper[4822]: I1010 08:38:46.656333 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:38:46 crc kubenswrapper[4822]: E1010 08:38:46.657552 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:39:00 crc kubenswrapper[4822]: I1010 08:39:00.650519 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:39:00 crc kubenswrapper[4822]: E1010 08:39:00.652426 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:39:13 crc kubenswrapper[4822]: I1010 08:39:13.650590 4822 scope.go:117] "RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:39:13 crc kubenswrapper[4822]: I1010 08:39:13.945288 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4"} Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.273908 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:39:49 crc kubenswrapper[4822]: E1010 08:39:49.275014 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="extract-content" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.275030 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="extract-content" Oct 10 08:39:49 crc kubenswrapper[4822]: E1010 08:39:49.275068 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="extract-utilities" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.275075 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="extract-utilities" Oct 10 08:39:49 crc kubenswrapper[4822]: E1010 08:39:49.275101 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="registry-server" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.275107 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="registry-server" Oct 
10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.275295 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="521abdc6-f3ce-47f2-8541-78a2ee5c3bcd" containerName="registry-server" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.277647 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.286519 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.349945 4822 generic.go:334] "Generic (PLEG): container finished" podID="d85b9ab1-c113-48ef-9875-ba3ebf6427f9" containerID="935e0c812b0a6be71a60a7e98de6256810f79104cdb8f32aa520a5b3890a9df8" exitCode=0 Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.350054 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" event={"ID":"d85b9ab1-c113-48ef-9875-ba3ebf6427f9","Type":"ContainerDied","Data":"935e0c812b0a6be71a60a7e98de6256810f79104cdb8f32aa520a5b3890a9df8"} Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.434921 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wwnp\" (UniqueName: \"kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.435435 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 
08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.435462 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.537342 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.537530 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwnp\" (UniqueName: \"kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.537644 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.538146 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc 
kubenswrapper[4822]: I1010 08:39:49.538186 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.575280 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwnp\" (UniqueName: \"kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp\") pod \"certified-operators-2z8qd\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:49 crc kubenswrapper[4822]: I1010 08:39:49.600699 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.183366 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.362169 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerStarted","Data":"e677e778288b677479a2f35afece28fed75d4c66f78ce820277b56d692d9f9a0"} Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.878306 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.978758 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.979392 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.979595 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.979667 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.979746 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pftdm\" (UniqueName: \"kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.979977 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph\") pod \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\" (UID: \"d85b9ab1-c113-48ef-9875-ba3ebf6427f9\") " Oct 10 08:39:50 crc kubenswrapper[4822]: I1010 08:39:50.990245 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph" (OuterVolumeSpecName: "ceph") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.020103 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm" (OuterVolumeSpecName: "kube-api-access-pftdm") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "kube-api-access-pftdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.039970 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.064953 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.074967 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory" (OuterVolumeSpecName: "inventory") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.082952 4822 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.082986 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.082997 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pftdm\" (UniqueName: \"kubernetes.io/projected/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-kube-api-access-pftdm\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.083005 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.083013 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.083394 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d85b9ab1-c113-48ef-9875-ba3ebf6427f9" (UID: "d85b9ab1-c113-48ef-9875-ba3ebf6427f9"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.184431 4822 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d85b9ab1-c113-48ef-9875-ba3ebf6427f9-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.374562 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" event={"ID":"d85b9ab1-c113-48ef-9875-ba3ebf6427f9","Type":"ContainerDied","Data":"dfa6e04545f6d10e2456560fe51fd50205702fc9eb5593f7314914997daa0115"} Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.374708 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa6e04545f6d10e2456560fe51fd50205702fc9eb5593f7314914997daa0115" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.374888 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9slgf" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.377209 4822 generic.go:334] "Generic (PLEG): container finished" podID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerID="0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53" exitCode=0 Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.377257 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerDied","Data":"0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53"} Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.474688 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-frjzw"] Oct 10 08:39:51 crc kubenswrapper[4822]: E1010 08:39:51.475339 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85b9ab1-c113-48ef-9875-ba3ebf6427f9" containerName="libvirt-openstack-openstack-cell1" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.475367 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85b9ab1-c113-48ef-9875-ba3ebf6427f9" containerName="libvirt-openstack-openstack-cell1" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.475660 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85b9ab1-c113-48ef-9875-ba3ebf6427f9" containerName="libvirt-openstack-openstack-cell1" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.476860 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.480396 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.480669 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.480870 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.481047 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.481261 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.481452 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.481639 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.491411 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-frjzw"] Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.592570 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.592681 4822 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.592722 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.592742 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.592938 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjgs\" (UniqueName: \"kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593029 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593058 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593125 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593215 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593271 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.593463 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695174 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjgs\" (UniqueName: \"kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695671 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695714 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695775 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695863 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695893 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695955 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.695975 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.696016 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.696044 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.696061 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.697679 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.698118 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1\") pod 
\"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.702346 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.702385 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.702654 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.703063 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.703167 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.703208 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.703370 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.703628 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.722694 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjgs\" (UniqueName: \"kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs\") pod \"nova-cell1-openstack-openstack-cell1-frjzw\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:51 crc kubenswrapper[4822]: I1010 08:39:51.804398 4822 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:39:52 crc kubenswrapper[4822]: I1010 08:39:52.400629 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-frjzw"] Oct 10 08:39:53 crc kubenswrapper[4822]: I1010 08:39:53.404786 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" event={"ID":"afd1eb20-744e-41a9-84b0-0dfb89dc1cea","Type":"ContainerStarted","Data":"43e9ca65388e688d763be6d0c260e0d7087ea12b267f6095c1606f51b01fa584"} Oct 10 08:39:53 crc kubenswrapper[4822]: I1010 08:39:53.405223 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" event={"ID":"afd1eb20-744e-41a9-84b0-0dfb89dc1cea","Type":"ContainerStarted","Data":"6f73c4d70f80a8bc2f29108b944a027153c5a26444f7c3a8c8cbb4044154bac7"} Oct 10 08:39:53 crc kubenswrapper[4822]: I1010 08:39:53.409512 4822 generic.go:334] "Generic (PLEG): container finished" podID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerID="a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9" exitCode=0 Oct 10 08:39:53 crc kubenswrapper[4822]: I1010 08:39:53.409566 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerDied","Data":"a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9"} Oct 10 08:39:53 crc kubenswrapper[4822]: I1010 08:39:53.442622 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" podStartSLOduration=1.9010760370000002 podStartE2EDuration="2.442559243s" podCreationTimestamp="2025-10-10 08:39:51 +0000 UTC" firstStartedPulling="2025-10-10 08:39:52.413728025 +0000 UTC m=+8139.508886221" lastFinishedPulling="2025-10-10 08:39:52.955211191 
+0000 UTC m=+8140.050369427" observedRunningTime="2025-10-10 08:39:53.429145169 +0000 UTC m=+8140.524303395" watchObservedRunningTime="2025-10-10 08:39:53.442559243 +0000 UTC m=+8140.537717439" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.080757 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.083467 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.091727 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.149407 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62v7\" (UniqueName: \"kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.149758 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.149854 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 
08:39:54.251960 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.252030 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.252098 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62v7\" (UniqueName: \"kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.252660 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.252674 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.273945 4822 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f62v7\" (UniqueName: \"kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7\") pod \"redhat-operators-vmkkt\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.431734 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:39:54 crc kubenswrapper[4822]: I1010 08:39:54.977170 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:39:55 crc kubenswrapper[4822]: I1010 08:39:55.438827 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerStarted","Data":"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b"} Oct 10 08:39:55 crc kubenswrapper[4822]: I1010 08:39:55.447291 4822 generic.go:334] "Generic (PLEG): container finished" podID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerID="2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39" exitCode=0 Oct 10 08:39:55 crc kubenswrapper[4822]: I1010 08:39:55.447343 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerDied","Data":"2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39"} Oct 10 08:39:55 crc kubenswrapper[4822]: I1010 08:39:55.447372 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerStarted","Data":"9c476dc913fbcbdf986c5719011c4f91a80678ba408e95c1fdf5a95eefa4f8a3"} Oct 10 08:39:55 crc kubenswrapper[4822]: I1010 08:39:55.469787 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-2z8qd" podStartSLOduration=3.609541453 podStartE2EDuration="6.469767366s" podCreationTimestamp="2025-10-10 08:39:49 +0000 UTC" firstStartedPulling="2025-10-10 08:39:51.379142491 +0000 UTC m=+8138.474300687" lastFinishedPulling="2025-10-10 08:39:54.239368394 +0000 UTC m=+8141.334526600" observedRunningTime="2025-10-10 08:39:55.464479534 +0000 UTC m=+8142.559637750" watchObservedRunningTime="2025-10-10 08:39:55.469767366 +0000 UTC m=+8142.564925562" Oct 10 08:39:57 crc kubenswrapper[4822]: I1010 08:39:57.471254 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerStarted","Data":"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086"} Oct 10 08:39:59 crc kubenswrapper[4822]: I1010 08:39:59.601406 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:59 crc kubenswrapper[4822]: I1010 08:39:59.602799 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:39:59 crc kubenswrapper[4822]: I1010 08:39:59.665009 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:40:00 crc kubenswrapper[4822]: I1010 08:40:00.517382 4822 generic.go:334] "Generic (PLEG): container finished" podID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerID="f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086" exitCode=0 Oct 10 08:40:00 crc kubenswrapper[4822]: I1010 08:40:00.517461 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerDied","Data":"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086"} Oct 10 08:40:00 crc 
kubenswrapper[4822]: I1010 08:40:00.567076 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:40:01 crc kubenswrapper[4822]: I1010 08:40:01.528858 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerStarted","Data":"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c"} Oct 10 08:40:01 crc kubenswrapper[4822]: I1010 08:40:01.561994 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vmkkt" podStartSLOduration=2.048198922 podStartE2EDuration="7.561967228s" podCreationTimestamp="2025-10-10 08:39:54 +0000 UTC" firstStartedPulling="2025-10-10 08:39:55.452939533 +0000 UTC m=+8142.548097719" lastFinishedPulling="2025-10-10 08:40:00.966707829 +0000 UTC m=+8148.061866025" observedRunningTime="2025-10-10 08:40:01.560205088 +0000 UTC m=+8148.655363294" watchObservedRunningTime="2025-10-10 08:40:01.561967228 +0000 UTC m=+8148.657125474" Oct 10 08:40:02 crc kubenswrapper[4822]: I1010 08:40:02.465194 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:40:03 crc kubenswrapper[4822]: I1010 08:40:03.546206 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2z8qd" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="registry-server" containerID="cri-o://b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b" gracePeriod=2 Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.055889 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.107932 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities\") pod \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.108103 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wwnp\" (UniqueName: \"kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp\") pod \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.108323 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content\") pod \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\" (UID: \"1e76f28c-cac7-4071-bab7-28e6e8b5ab97\") " Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.108776 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities" (OuterVolumeSpecName: "utilities") pod "1e76f28c-cac7-4071-bab7-28e6e8b5ab97" (UID: "1e76f28c-cac7-4071-bab7-28e6e8b5ab97"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.113145 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.116647 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp" (OuterVolumeSpecName: "kube-api-access-4wwnp") pod "1e76f28c-cac7-4071-bab7-28e6e8b5ab97" (UID: "1e76f28c-cac7-4071-bab7-28e6e8b5ab97"). InnerVolumeSpecName "kube-api-access-4wwnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.155996 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e76f28c-cac7-4071-bab7-28e6e8b5ab97" (UID: "1e76f28c-cac7-4071-bab7-28e6e8b5ab97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.214747 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wwnp\" (UniqueName: \"kubernetes.io/projected/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-kube-api-access-4wwnp\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.214782 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e76f28c-cac7-4071-bab7-28e6e8b5ab97-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.432358 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.432785 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.558617 4822 generic.go:334] "Generic (PLEG): container finished" podID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerID="b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b" exitCode=0 Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.558658 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerDied","Data":"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b"} Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.558683 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2z8qd" event={"ID":"1e76f28c-cac7-4071-bab7-28e6e8b5ab97","Type":"ContainerDied","Data":"e677e778288b677479a2f35afece28fed75d4c66f78ce820277b56d692d9f9a0"} Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.558699 4822 scope.go:117] "RemoveContainer" 
containerID="b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.558766 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2z8qd" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.581648 4822 scope.go:117] "RemoveContainer" containerID="a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.602556 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.612828 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2z8qd"] Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.633858 4822 scope.go:117] "RemoveContainer" containerID="0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.684190 4822 scope.go:117] "RemoveContainer" containerID="b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b" Oct 10 08:40:04 crc kubenswrapper[4822]: E1010 08:40:04.684690 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b\": container with ID starting with b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b not found: ID does not exist" containerID="b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.684715 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b"} err="failed to get container status \"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b\": rpc error: code = NotFound desc = 
could not find container \"b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b\": container with ID starting with b41cf7efdf2393e31f7edf70b7874531d6b51421691c558b6d023bdb55be1e4b not found: ID does not exist" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.684738 4822 scope.go:117] "RemoveContainer" containerID="a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9" Oct 10 08:40:04 crc kubenswrapper[4822]: E1010 08:40:04.684966 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9\": container with ID starting with a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9 not found: ID does not exist" containerID="a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.684986 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9"} err="failed to get container status \"a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9\": rpc error: code = NotFound desc = could not find container \"a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9\": container with ID starting with a53f74995c9380611f54edb67f7ec108c7e41cc78f118a17e56a19a456ad69a9 not found: ID does not exist" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.685001 4822 scope.go:117] "RemoveContainer" containerID="0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53" Oct 10 08:40:04 crc kubenswrapper[4822]: E1010 08:40:04.685201 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53\": container with ID starting with 0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53 not 
found: ID does not exist" containerID="0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53" Oct 10 08:40:04 crc kubenswrapper[4822]: I1010 08:40:04.685220 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53"} err="failed to get container status \"0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53\": rpc error: code = NotFound desc = could not find container \"0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53\": container with ID starting with 0a0b30c860ea76c592783c3f4e94672fffe4d3e3db1f16f020aebc9419fc0f53 not found: ID does not exist" Oct 10 08:40:05 crc kubenswrapper[4822]: I1010 08:40:05.501272 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vmkkt" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="registry-server" probeResult="failure" output=< Oct 10 08:40:05 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:40:05 crc kubenswrapper[4822]: > Oct 10 08:40:05 crc kubenswrapper[4822]: I1010 08:40:05.661815 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" path="/var/lib/kubelet/pods/1e76f28c-cac7-4071-bab7-28e6e8b5ab97/volumes" Oct 10 08:40:14 crc kubenswrapper[4822]: I1010 08:40:14.486425 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:14 crc kubenswrapper[4822]: I1010 08:40:14.536526 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:14 crc kubenswrapper[4822]: I1010 08:40:14.722759 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:40:15 crc kubenswrapper[4822]: I1010 08:40:15.690467 4822 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vmkkt" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="registry-server" containerID="cri-o://7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c" gracePeriod=2 Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.202389 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.356471 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities\") pod \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.356782 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content\") pod \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.357010 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62v7\" (UniqueName: \"kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7\") pod \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\" (UID: \"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a\") " Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.358078 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities" (OuterVolumeSpecName: "utilities") pod "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" (UID: "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.371358 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7" (OuterVolumeSpecName: "kube-api-access-f62v7") pod "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" (UID: "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a"). InnerVolumeSpecName "kube-api-access-f62v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.461754 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.461845 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62v7\" (UniqueName: \"kubernetes.io/projected/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-kube-api-access-f62v7\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.478564 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" (UID: "e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.564196 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.702872 4822 generic.go:334] "Generic (PLEG): container finished" podID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerID="7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c" exitCode=0 Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.702924 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerDied","Data":"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c"} Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.702955 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vmkkt" event={"ID":"e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a","Type":"ContainerDied","Data":"9c476dc913fbcbdf986c5719011c4f91a80678ba408e95c1fdf5a95eefa4f8a3"} Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.702977 4822 scope.go:117] "RemoveContainer" containerID="7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.704087 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vmkkt" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.754234 4822 scope.go:117] "RemoveContainer" containerID="f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.765656 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.792154 4822 scope.go:117] "RemoveContainer" containerID="2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.792681 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vmkkt"] Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.843489 4822 scope.go:117] "RemoveContainer" containerID="7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c" Oct 10 08:40:16 crc kubenswrapper[4822]: E1010 08:40:16.844306 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c\": container with ID starting with 7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c not found: ID does not exist" containerID="7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.844351 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c"} err="failed to get container status \"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c\": rpc error: code = NotFound desc = could not find container \"7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c\": container with ID starting with 7cc15283a065fa58dcc90ec1fca14644e552b3aca6b53a556f3c6ebcbfae7f8c not found: ID does 
not exist" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.844432 4822 scope.go:117] "RemoveContainer" containerID="f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086" Oct 10 08:40:16 crc kubenswrapper[4822]: E1010 08:40:16.845150 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086\": container with ID starting with f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086 not found: ID does not exist" containerID="f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.845187 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086"} err="failed to get container status \"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086\": rpc error: code = NotFound desc = could not find container \"f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086\": container with ID starting with f9cb0380001a36014b0124f36ad1ce08daa9b1be68b29a4a5b9f0d5290fd1086 not found: ID does not exist" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.845213 4822 scope.go:117] "RemoveContainer" containerID="2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39" Oct 10 08:40:16 crc kubenswrapper[4822]: E1010 08:40:16.845590 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39\": container with ID starting with 2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39 not found: ID does not exist" containerID="2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39" Oct 10 08:40:16 crc kubenswrapper[4822]: I1010 08:40:16.845649 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39"} err="failed to get container status \"2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39\": rpc error: code = NotFound desc = could not find container \"2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39\": container with ID starting with 2145c4a5809c3211b34325d542174a69c755cd3ff1ce4a6585ca04033e1aba39 not found: ID does not exist" Oct 10 08:40:17 crc kubenswrapper[4822]: I1010 08:40:17.663273 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" path="/var/lib/kubelet/pods/e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a/volumes" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.743008 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744239 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744256 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744288 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744296 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744317 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="extract-content" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744326 4822 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="extract-content" Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744337 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="extract-content" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744345 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="extract-content" Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744378 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="extract-utilities" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744386 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="extract-utilities" Oct 10 08:40:26 crc kubenswrapper[4822]: E1010 08:40:26.744407 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="extract-utilities" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744416 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="extract-utilities" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744656 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e76f28c-cac7-4071-bab7-28e6e8b5ab97" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.744679 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e053c0b4-c2e8-4c88-ad6d-9accf00bfa7a" containerName="registry-server" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.747314 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.795686 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.909817 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.910037 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:26 crc kubenswrapper[4822]: I1010 08:40:26.910201 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44tcl\" (UniqueName: \"kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.011770 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.011903 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-44tcl\" (UniqueName: \"kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.011979 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.012325 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.012356 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.113183 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44tcl\" (UniqueName: \"kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl\") pod \"community-operators-th9k6\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:27 crc kubenswrapper[4822]: I1010 08:40:27.373343 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:28 crc kubenswrapper[4822]: I1010 08:40:28.008106 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:28 crc kubenswrapper[4822]: I1010 08:40:28.830069 4822 generic.go:334] "Generic (PLEG): container finished" podID="2010f4c4-1486-4c68-b242-f3d08891a078" containerID="3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55" exitCode=0 Oct 10 08:40:28 crc kubenswrapper[4822]: I1010 08:40:28.830185 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerDied","Data":"3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55"} Oct 10 08:40:28 crc kubenswrapper[4822]: I1010 08:40:28.830879 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerStarted","Data":"19bbeb5160993e1f057ddd430f74afa700c4dbc2f2009fc76db41e3401f0227c"} Oct 10 08:40:28 crc kubenswrapper[4822]: I1010 08:40:28.832259 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:40:30 crc kubenswrapper[4822]: I1010 08:40:30.852369 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerStarted","Data":"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b"} Oct 10 08:40:31 crc kubenswrapper[4822]: I1010 08:40:31.868342 4822 generic.go:334] "Generic (PLEG): container finished" podID="2010f4c4-1486-4c68-b242-f3d08891a078" containerID="29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b" exitCode=0 Oct 10 08:40:31 crc kubenswrapper[4822]: I1010 08:40:31.868442 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerDied","Data":"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b"} Oct 10 08:40:32 crc kubenswrapper[4822]: I1010 08:40:32.880729 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerStarted","Data":"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de"} Oct 10 08:40:32 crc kubenswrapper[4822]: I1010 08:40:32.897926 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-th9k6" podStartSLOduration=3.405680538 podStartE2EDuration="6.897907384s" podCreationTimestamp="2025-10-10 08:40:26 +0000 UTC" firstStartedPulling="2025-10-10 08:40:28.831855524 +0000 UTC m=+8175.927013740" lastFinishedPulling="2025-10-10 08:40:32.32408239 +0000 UTC m=+8179.419240586" observedRunningTime="2025-10-10 08:40:32.895913867 +0000 UTC m=+8179.991072063" watchObservedRunningTime="2025-10-10 08:40:32.897907384 +0000 UTC m=+8179.993065580" Oct 10 08:40:37 crc kubenswrapper[4822]: I1010 08:40:37.373703 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:37 crc kubenswrapper[4822]: I1010 08:40:37.374569 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:37 crc kubenswrapper[4822]: I1010 08:40:37.446895 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:37 crc kubenswrapper[4822]: I1010 08:40:37.996546 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:38 crc kubenswrapper[4822]: I1010 08:40:38.059727 
4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:39 crc kubenswrapper[4822]: I1010 08:40:39.956849 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-th9k6" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="registry-server" containerID="cri-o://625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de" gracePeriod=2 Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.451017 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.624176 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content\") pod \"2010f4c4-1486-4c68-b242-f3d08891a078\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.624227 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities\") pod \"2010f4c4-1486-4c68-b242-f3d08891a078\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.624279 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44tcl\" (UniqueName: \"kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl\") pod \"2010f4c4-1486-4c68-b242-f3d08891a078\" (UID: \"2010f4c4-1486-4c68-b242-f3d08891a078\") " Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.625550 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities" (OuterVolumeSpecName: "utilities") pod 
"2010f4c4-1486-4c68-b242-f3d08891a078" (UID: "2010f4c4-1486-4c68-b242-f3d08891a078"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.630601 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl" (OuterVolumeSpecName: "kube-api-access-44tcl") pod "2010f4c4-1486-4c68-b242-f3d08891a078" (UID: "2010f4c4-1486-4c68-b242-f3d08891a078"). InnerVolumeSpecName "kube-api-access-44tcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.699176 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2010f4c4-1486-4c68-b242-f3d08891a078" (UID: "2010f4c4-1486-4c68-b242-f3d08891a078"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.726470 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.726511 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2010f4c4-1486-4c68-b242-f3d08891a078-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.726524 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44tcl\" (UniqueName: \"kubernetes.io/projected/2010f4c4-1486-4c68-b242-f3d08891a078-kube-api-access-44tcl\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.967893 4822 generic.go:334] "Generic (PLEG): container finished" podID="2010f4c4-1486-4c68-b242-f3d08891a078" containerID="625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de" exitCode=0 Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.967914 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerDied","Data":"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de"} Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.967930 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-th9k6" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.967992 4822 scope.go:117] "RemoveContainer" containerID="625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de" Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.967979 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-th9k6" event={"ID":"2010f4c4-1486-4c68-b242-f3d08891a078","Type":"ContainerDied","Data":"19bbeb5160993e1f057ddd430f74afa700c4dbc2f2009fc76db41e3401f0227c"} Oct 10 08:40:40 crc kubenswrapper[4822]: I1010 08:40:40.999241 4822 scope.go:117] "RemoveContainer" containerID="29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.016319 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.027676 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-th9k6"] Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.028729 4822 scope.go:117] "RemoveContainer" containerID="3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.102208 4822 scope.go:117] "RemoveContainer" containerID="625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de" Oct 10 08:40:41 crc kubenswrapper[4822]: E1010 08:40:41.102825 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de\": container with ID starting with 625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de not found: ID does not exist" containerID="625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.102858 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de"} err="failed to get container status \"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de\": rpc error: code = NotFound desc = could not find container \"625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de\": container with ID starting with 625ecb80ea4a3ff504668e71c65c6ee231d220add825421d8c8cd4cea4ff25de not found: ID does not exist" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.102877 4822 scope.go:117] "RemoveContainer" containerID="29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b" Oct 10 08:40:41 crc kubenswrapper[4822]: E1010 08:40:41.103640 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b\": container with ID starting with 29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b not found: ID does not exist" containerID="29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.103673 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b"} err="failed to get container status \"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b\": rpc error: code = NotFound desc = could not find container \"29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b\": container with ID starting with 29618405083aab7fb4272221111a392bbdca85ccff8f105df9c7cd50401ecb2b not found: ID does not exist" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.103687 4822 scope.go:117] "RemoveContainer" containerID="3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55" Oct 10 08:40:41 crc kubenswrapper[4822]: E1010 
08:40:41.103996 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55\": container with ID starting with 3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55 not found: ID does not exist" containerID="3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.104020 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55"} err="failed to get container status \"3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55\": rpc error: code = NotFound desc = could not find container \"3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55\": container with ID starting with 3874fb968d21bad8736ca8afcbdf33849c8d7922ce675c1b6edaf120a476bf55 not found: ID does not exist" Oct 10 08:40:41 crc kubenswrapper[4822]: I1010 08:40:41.665499 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" path="/var/lib/kubelet/pods/2010f4c4-1486-4c68-b242-f3d08891a078/volumes" Oct 10 08:41:31 crc kubenswrapper[4822]: I1010 08:41:31.337006 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:41:31 crc kubenswrapper[4822]: I1010 08:41:31.337718 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 10 08:42:01 crc kubenswrapper[4822]: I1010 08:42:01.337476 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:42:01 crc kubenswrapper[4822]: I1010 08:42:01.338232 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:42:31 crc kubenswrapper[4822]: I1010 08:42:31.336604 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:42:31 crc kubenswrapper[4822]: I1010 08:42:31.337723 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:42:31 crc kubenswrapper[4822]: I1010 08:42:31.337841 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:42:31 crc kubenswrapper[4822]: I1010 08:42:31.339524 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4"} 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:42:31 crc kubenswrapper[4822]: I1010 08:42:31.339631 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4" gracePeriod=600 Oct 10 08:42:31 crc kubenswrapper[4822]: E1010 08:42:31.488655 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86167202_f72a_4271_bdbe_32ba0bf71fff.slice/crio-conmon-979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:42:32 crc kubenswrapper[4822]: I1010 08:42:32.212261 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4" exitCode=0 Oct 10 08:42:32 crc kubenswrapper[4822]: I1010 08:42:32.212318 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4"} Oct 10 08:42:32 crc kubenswrapper[4822]: I1010 08:42:32.212562 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5"} Oct 10 08:42:32 crc kubenswrapper[4822]: I1010 08:42:32.212590 4822 scope.go:117] 
"RemoveContainer" containerID="5901a43981f232cf9bd41bff6af4ed3009b99a9ff55ec231373d7d3b928829d1" Oct 10 08:43:32 crc kubenswrapper[4822]: I1010 08:43:32.853147 4822 generic.go:334] "Generic (PLEG): container finished" podID="afd1eb20-744e-41a9-84b0-0dfb89dc1cea" containerID="43e9ca65388e688d763be6d0c260e0d7087ea12b267f6095c1606f51b01fa584" exitCode=0 Oct 10 08:43:32 crc kubenswrapper[4822]: I1010 08:43:32.853232 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" event={"ID":"afd1eb20-744e-41a9-84b0-0dfb89dc1cea","Type":"ContainerDied","Data":"43e9ca65388e688d763be6d0c260e0d7087ea12b267f6095c1606f51b01fa584"} Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.346092 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546376 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546479 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546637 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc 
kubenswrapper[4822]: I1010 08:43:34.546678 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfjgs\" (UniqueName: \"kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546740 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546793 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.546981 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.547035 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.547072 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.547108 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.547240 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0\") pod \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\" (UID: \"afd1eb20-744e-41a9-84b0-0dfb89dc1cea\") " Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.557675 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs" (OuterVolumeSpecName: "kube-api-access-gfjgs") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "kube-api-access-gfjgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.557675 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph" (OuterVolumeSpecName: "ceph") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.571952 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.591068 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.596376 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.597965 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.606590 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.608574 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.617207 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory" (OuterVolumeSpecName: "inventory") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.623251 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.639841 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "afd1eb20-744e-41a9-84b0-0dfb89dc1cea" (UID: "afd1eb20-744e-41a9-84b0-0dfb89dc1cea"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652229 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652494 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652510 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652526 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652546 4822 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652561 
4822 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652574 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652587 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652600 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjgs\" (UniqueName: \"kubernetes.io/projected/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-kube-api-access-gfjgs\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652613 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.652631 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/afd1eb20-744e-41a9-84b0-0dfb89dc1cea-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.877306 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" event={"ID":"afd1eb20-744e-41a9-84b0-0dfb89dc1cea","Type":"ContainerDied","Data":"6f73c4d70f80a8bc2f29108b944a027153c5a26444f7c3a8c8cbb4044154bac7"} Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.877670 4822 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f73c4d70f80a8bc2f29108b944a027153c5a26444f7c3a8c8cbb4044154bac7" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.877347 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-frjzw" Oct 10 08:43:34 crc kubenswrapper[4822]: I1010 08:43:34.990548 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tcnv"] Oct 10 08:43:35 crc kubenswrapper[4822]: E1010 08:43:34.991257 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="registry-server" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.991286 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="registry-server" Oct 10 08:43:35 crc kubenswrapper[4822]: E1010 08:43:34.991391 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="extract-content" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.991408 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="extract-content" Oct 10 08:43:35 crc kubenswrapper[4822]: E1010 08:43:34.991443 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="extract-utilities" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.991455 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="extract-utilities" Oct 10 08:43:35 crc kubenswrapper[4822]: E1010 08:43:34.991498 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd1eb20-744e-41a9-84b0-0dfb89dc1cea" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 
08:43:34.991509 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd1eb20-744e-41a9-84b0-0dfb89dc1cea" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.991888 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="2010f4c4-1486-4c68-b242-f3d08891a078" containerName="registry-server" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.991905 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd1eb20-744e-41a9-84b0-0dfb89dc1cea" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.993759 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.996204 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.996372 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.996648 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.996902 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:34.997032 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.008675 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tcnv"] Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164266 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164352 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164452 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164511 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164544 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: 
\"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.164784 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5b5\" (UniqueName: \"kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.165833 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.165934 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267274 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267361 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267524 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267575 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267622 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267658 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 
08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267693 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.267744 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5b5\" (UniqueName: \"kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.272302 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.272332 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.272878 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2\") pod 
\"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.277034 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.279682 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.279795 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.285208 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.290485 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5b5\" (UniqueName: 
\"kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5\") pod \"telemetry-openstack-openstack-cell1-8tcnv\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.351712 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:43:35 crc kubenswrapper[4822]: I1010 08:43:35.973908 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tcnv"] Oct 10 08:43:35 crc kubenswrapper[4822]: W1010 08:43:35.978765 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94266217_0cc8_4e5c_a9bd_155671c58a19.slice/crio-910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159 WatchSource:0}: Error finding container 910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159: Status 404 returned error can't find the container with id 910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159 Oct 10 08:43:36 crc kubenswrapper[4822]: I1010 08:43:36.904980 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" event={"ID":"94266217-0cc8-4e5c-a9bd-155671c58a19","Type":"ContainerStarted","Data":"47bc52a76fe5f535c7691efc255455bc9f44b390e5775035d8c1a52c99296b11"} Oct 10 08:43:36 crc kubenswrapper[4822]: I1010 08:43:36.905378 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" event={"ID":"94266217-0cc8-4e5c-a9bd-155671c58a19","Type":"ContainerStarted","Data":"910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159"} Oct 10 08:44:31 crc kubenswrapper[4822]: I1010 08:44:31.336426 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:44:31 crc kubenswrapper[4822]: I1010 08:44:31.337101 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.172107 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" podStartSLOduration=85.575101484 podStartE2EDuration="1m26.172066082s" podCreationTimestamp="2025-10-10 08:43:34 +0000 UTC" firstStartedPulling="2025-10-10 08:43:35.981341029 +0000 UTC m=+8363.076499225" lastFinishedPulling="2025-10-10 08:43:36.578305637 +0000 UTC m=+8363.673463823" observedRunningTime="2025-10-10 08:43:36.936885145 +0000 UTC m=+8364.032043341" watchObservedRunningTime="2025-10-10 08:45:00.172066082 +0000 UTC m=+8447.267224288" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.183525 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc"] Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.186015 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.190888 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.190910 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.195978 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc"] Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.317313 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.317402 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.317501 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d827s\" (UniqueName: \"kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.419858 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d827s\" (UniqueName: \"kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.419983 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.420033 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.421175 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.427389 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.439599 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d827s\" (UniqueName: \"kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s\") pod \"collect-profiles-29334765-tvkkc\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:00 crc kubenswrapper[4822]: I1010 08:45:00.526297 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.019786 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc"] Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.337498 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.338007 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.856305 4822 generic.go:334] "Generic (PLEG): container finished" podID="66d35533-a39e-4021-8bfa-764b5bf70331" 
containerID="103a39b6e32f7396fe4b083bdeed0a6a1db569155b5a9f4e0e76b48bf48fd1a0" exitCode=0 Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.856365 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" event={"ID":"66d35533-a39e-4021-8bfa-764b5bf70331","Type":"ContainerDied","Data":"103a39b6e32f7396fe4b083bdeed0a6a1db569155b5a9f4e0e76b48bf48fd1a0"} Oct 10 08:45:01 crc kubenswrapper[4822]: I1010 08:45:01.856393 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" event={"ID":"66d35533-a39e-4021-8bfa-764b5bf70331","Type":"ContainerStarted","Data":"a29232d94560ff84d87c9b9773c57a7a4e221fae7f8d951800b03f8fb47e3b08"} Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.244898 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.393732 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume\") pod \"66d35533-a39e-4021-8bfa-764b5bf70331\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.393950 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d827s\" (UniqueName: \"kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s\") pod \"66d35533-a39e-4021-8bfa-764b5bf70331\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.393999 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume\") pod 
\"66d35533-a39e-4021-8bfa-764b5bf70331\" (UID: \"66d35533-a39e-4021-8bfa-764b5bf70331\") " Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.395408 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume" (OuterVolumeSpecName: "config-volume") pod "66d35533-a39e-4021-8bfa-764b5bf70331" (UID: "66d35533-a39e-4021-8bfa-764b5bf70331"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.401351 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "66d35533-a39e-4021-8bfa-764b5bf70331" (UID: "66d35533-a39e-4021-8bfa-764b5bf70331"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.401943 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s" (OuterVolumeSpecName: "kube-api-access-d827s") pod "66d35533-a39e-4021-8bfa-764b5bf70331" (UID: "66d35533-a39e-4021-8bfa-764b5bf70331"). InnerVolumeSpecName "kube-api-access-d827s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.496770 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d827s\" (UniqueName: \"kubernetes.io/projected/66d35533-a39e-4021-8bfa-764b5bf70331-kube-api-access-d827s\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.496833 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66d35533-a39e-4021-8bfa-764b5bf70331-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.496843 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66d35533-a39e-4021-8bfa-764b5bf70331-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.881747 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" event={"ID":"66d35533-a39e-4021-8bfa-764b5bf70331","Type":"ContainerDied","Data":"a29232d94560ff84d87c9b9773c57a7a4e221fae7f8d951800b03f8fb47e3b08"} Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.881828 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-tvkkc" Oct 10 08:45:03 crc kubenswrapper[4822]: I1010 08:45:03.881846 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a29232d94560ff84d87c9b9773c57a7a4e221fae7f8d951800b03f8fb47e3b08" Oct 10 08:45:04 crc kubenswrapper[4822]: I1010 08:45:04.331752 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj"] Oct 10 08:45:04 crc kubenswrapper[4822]: I1010 08:45:04.344178 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-cjcpj"] Oct 10 08:45:05 crc kubenswrapper[4822]: I1010 08:45:05.672852 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f8e2c6-0581-44b8-ac9e-2fde66715ab5" path="/var/lib/kubelet/pods/d8f8e2c6-0581-44b8-ac9e-2fde66715ab5/volumes" Oct 10 08:45:31 crc kubenswrapper[4822]: I1010 08:45:31.336596 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:45:31 crc kubenswrapper[4822]: I1010 08:45:31.337167 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:45:31 crc kubenswrapper[4822]: I1010 08:45:31.337222 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:45:31 crc kubenswrapper[4822]: I1010 08:45:31.338459 4822 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:45:31 crc kubenswrapper[4822]: I1010 08:45:31.338532 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" gracePeriod=600 Oct 10 08:45:31 crc kubenswrapper[4822]: E1010 08:45:31.461289 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:45:32 crc kubenswrapper[4822]: I1010 08:45:32.255341 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" exitCode=0 Oct 10 08:45:32 crc kubenswrapper[4822]: I1010 08:45:32.255403 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5"} Oct 10 08:45:32 crc kubenswrapper[4822]: I1010 08:45:32.256086 4822 scope.go:117] "RemoveContainer" 
containerID="979d861a84ab20bb6e00fbd74ad33dbc8f1ca525906a25ff9b40e12809ddd0d4" Oct 10 08:45:32 crc kubenswrapper[4822]: I1010 08:45:32.256870 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:45:32 crc kubenswrapper[4822]: E1010 08:45:32.257186 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:45:39 crc kubenswrapper[4822]: I1010 08:45:39.098007 4822 scope.go:117] "RemoveContainer" containerID="01c0eb829b912fe4aea4d733c6697631f0b82f1ecec271642cbc6268123c64b1" Oct 10 08:45:45 crc kubenswrapper[4822]: I1010 08:45:45.651951 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:45:45 crc kubenswrapper[4822]: E1010 08:45:45.653509 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:45:56 crc kubenswrapper[4822]: I1010 08:45:56.650915 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:45:56 crc kubenswrapper[4822]: E1010 08:45:56.651952 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:46:11 crc kubenswrapper[4822]: I1010 08:46:11.650758 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:46:11 crc kubenswrapper[4822]: E1010 08:46:11.651843 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:46:26 crc kubenswrapper[4822]: I1010 08:46:26.651333 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:46:26 crc kubenswrapper[4822]: E1010 08:46:26.654020 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:46:39 crc kubenswrapper[4822]: I1010 08:46:39.651606 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:46:39 crc kubenswrapper[4822]: E1010 08:46:39.653063 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:46:51 crc kubenswrapper[4822]: I1010 08:46:51.650827 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:46:51 crc kubenswrapper[4822]: E1010 08:46:51.651662 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:47:02 crc kubenswrapper[4822]: I1010 08:47:02.650854 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:47:02 crc kubenswrapper[4822]: E1010 08:47:02.651825 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:47:15 crc kubenswrapper[4822]: I1010 08:47:15.651088 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:47:15 crc kubenswrapper[4822]: E1010 08:47:15.651869 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:47:27 crc kubenswrapper[4822]: I1010 08:47:27.652150 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:47:27 crc kubenswrapper[4822]: E1010 08:47:27.653445 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:47:41 crc kubenswrapper[4822]: I1010 08:47:41.770840 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" event={"ID":"94266217-0cc8-4e5c-a9bd-155671c58a19","Type":"ContainerDied","Data":"47bc52a76fe5f535c7691efc255455bc9f44b390e5775035d8c1a52c99296b11"} Oct 10 08:47:41 crc kubenswrapper[4822]: I1010 08:47:41.770918 4822 generic.go:334] "Generic (PLEG): container finished" podID="94266217-0cc8-4e5c-a9bd-155671c58a19" containerID="47bc52a76fe5f535c7691efc255455bc9f44b390e5775035d8c1a52c99296b11" exitCode=0 Oct 10 08:47:42 crc kubenswrapper[4822]: I1010 08:47:42.650103 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:47:42 crc kubenswrapper[4822]: E1010 08:47:42.650368 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.207170 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259272 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259360 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw5b5\" (UniqueName: \"kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259438 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259585 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259702 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259731 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259764 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.259785 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle\") pod \"94266217-0cc8-4e5c-a9bd-155671c58a19\" (UID: \"94266217-0cc8-4e5c-a9bd-155671c58a19\") " Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.266082 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph" (OuterVolumeSpecName: "ceph") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.267038 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.270777 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5" (OuterVolumeSpecName: "kube-api-access-dw5b5") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "kube-api-access-dw5b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.296713 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.297962 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.301792 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.303392 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory" (OuterVolumeSpecName: "inventory") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.311356 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94266217-0cc8-4e5c-a9bd-155671c58a19" (UID: "94266217-0cc8-4e5c-a9bd-155671c58a19"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362444 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362489 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362502 4822 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362515 4822 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362526 4822 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362535 4822 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362547 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw5b5\" (UniqueName: 
\"kubernetes.io/projected/94266217-0cc8-4e5c-a9bd-155671c58a19-kube-api-access-dw5b5\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.362556 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94266217-0cc8-4e5c-a9bd-155671c58a19-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.791818 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" event={"ID":"94266217-0cc8-4e5c-a9bd-155671c58a19","Type":"ContainerDied","Data":"910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159"} Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.792117 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="910b934cc66365028f084b23faaa4fe3e0f73d348de2102e0b8cc63e3010e159" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.791869 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tcnv" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.882246 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-n9tvx"] Oct 10 08:47:43 crc kubenswrapper[4822]: E1010 08:47:43.882847 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d35533-a39e-4021-8bfa-764b5bf70331" containerName="collect-profiles" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.882872 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d35533-a39e-4021-8bfa-764b5bf70331" containerName="collect-profiles" Oct 10 08:47:43 crc kubenswrapper[4822]: E1010 08:47:43.882908 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94266217-0cc8-4e5c-a9bd-155671c58a19" containerName="telemetry-openstack-openstack-cell1" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.882918 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="94266217-0cc8-4e5c-a9bd-155671c58a19" containerName="telemetry-openstack-openstack-cell1" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.883180 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="94266217-0cc8-4e5c-a9bd-155671c58a19" containerName="telemetry-openstack-openstack-cell1" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.883228 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d35533-a39e-4021-8bfa-764b5bf70331" containerName="collect-profiles" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.884273 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.886592 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.886791 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.887005 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.887153 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.889267 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.916109 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-n9tvx"] Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.977377 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.977713 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqfc\" (UniqueName: \"kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.977762 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.977820 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.977856 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:43 crc kubenswrapper[4822]: I1010 08:47:43.978045 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.079758 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.079966 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqfc\" (UniqueName: \"kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.080011 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.080056 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.080083 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.080177 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.085612 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.089211 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.089213 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.089312 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0\") pod 
\"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.090006 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.100109 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqfc\" (UniqueName: \"kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc\") pod \"neutron-sriov-openstack-openstack-cell1-n9tvx\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.213180 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.761401 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-n9tvx"] Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.767695 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:47:44 crc kubenswrapper[4822]: I1010 08:47:44.809617 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" event={"ID":"9d03d865-d40b-44f1-adf6-bc451617f98a","Type":"ContainerStarted","Data":"cc261bde6dd174f444f9970979b82b1ade54c95888e2a7f187687e1334ec6b58"} Oct 10 08:47:45 crc kubenswrapper[4822]: I1010 08:47:45.819312 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" event={"ID":"9d03d865-d40b-44f1-adf6-bc451617f98a","Type":"ContainerStarted","Data":"67681b68a85e2e9f25603d49de705c382a3002f1cc45f41dc2a6adb930099532"} Oct 10 08:47:45 crc kubenswrapper[4822]: I1010 08:47:45.845869 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" podStartSLOduration=2.365720463 podStartE2EDuration="2.845851628s" podCreationTimestamp="2025-10-10 08:47:43 +0000 UTC" firstStartedPulling="2025-10-10 08:47:44.767424897 +0000 UTC m=+8611.862583093" lastFinishedPulling="2025-10-10 08:47:45.247556062 +0000 UTC m=+8612.342714258" observedRunningTime="2025-10-10 08:47:45.833912276 +0000 UTC m=+8612.929070482" watchObservedRunningTime="2025-10-10 08:47:45.845851628 +0000 UTC m=+8612.941009824" Oct 10 08:47:55 crc kubenswrapper[4822]: I1010 08:47:55.650692 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:47:55 crc kubenswrapper[4822]: E1010 08:47:55.651455 4822 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:48:06 crc kubenswrapper[4822]: I1010 08:48:06.650790 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:48:06 crc kubenswrapper[4822]: E1010 08:48:06.652209 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:48:17 crc kubenswrapper[4822]: I1010 08:48:17.650485 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:48:17 crc kubenswrapper[4822]: E1010 08:48:17.651882 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:48:28 crc kubenswrapper[4822]: I1010 08:48:28.650334 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:48:28 crc kubenswrapper[4822]: E1010 
08:48:28.651161 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:48:39 crc kubenswrapper[4822]: I1010 08:48:39.650866 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:48:39 crc kubenswrapper[4822]: E1010 08:48:39.652028 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:48:53 crc kubenswrapper[4822]: I1010 08:48:53.659087 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:48:53 crc kubenswrapper[4822]: E1010 08:48:53.659978 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:49:08 crc kubenswrapper[4822]: I1010 08:49:08.650846 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:49:08 crc 
kubenswrapper[4822]: E1010 08:49:08.651650 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:49:23 crc kubenswrapper[4822]: I1010 08:49:23.666065 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:49:23 crc kubenswrapper[4822]: E1010 08:49:23.667140 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:49:37 crc kubenswrapper[4822]: I1010 08:49:37.650455 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:49:37 crc kubenswrapper[4822]: E1010 08:49:37.651328 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:49:50 crc kubenswrapper[4822]: I1010 08:49:50.650283 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 
10 08:49:50 crc kubenswrapper[4822]: E1010 08:49:50.651168 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:50:01 crc kubenswrapper[4822]: I1010 08:50:01.651201 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:50:01 crc kubenswrapper[4822]: E1010 08:50:01.652077 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:50:12 crc kubenswrapper[4822]: I1010 08:50:12.650997 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:50:12 crc kubenswrapper[4822]: E1010 08:50:12.652347 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.305212 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.316219 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.345296 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.346870 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.346974 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.346997 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lzjv\" (UniqueName: \"kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.449710 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content\") pod \"certified-operators-mdvlp\" (UID: 
\"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.450008 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.450054 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lzjv\" (UniqueName: \"kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.450573 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.450917 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities\") pod \"certified-operators-mdvlp\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.473214 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lzjv\" (UniqueName: \"kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv\") pod \"certified-operators-mdvlp\" (UID: 
\"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:21 crc kubenswrapper[4822]: I1010 08:50:21.657596 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:22 crc kubenswrapper[4822]: I1010 08:50:22.162323 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:22 crc kubenswrapper[4822]: I1010 08:50:22.570067 4822 generic.go:334] "Generic (PLEG): container finished" podID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerID="751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95" exitCode=0 Oct 10 08:50:22 crc kubenswrapper[4822]: I1010 08:50:22.570129 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerDied","Data":"751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95"} Oct 10 08:50:22 crc kubenswrapper[4822]: I1010 08:50:22.570474 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerStarted","Data":"912d66e2f237d9aa43bdfcf4d414334d72583e42f1d673c48e3c452ed770c943"} Oct 10 08:50:24 crc kubenswrapper[4822]: I1010 08:50:24.594394 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerStarted","Data":"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56"} Oct 10 08:50:25 crc kubenswrapper[4822]: I1010 08:50:25.605221 4822 generic.go:334] "Generic (PLEG): container finished" podID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerID="055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56" exitCode=0 Oct 10 08:50:25 crc kubenswrapper[4822]: I1010 
08:50:25.605312 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerDied","Data":"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56"} Oct 10 08:50:26 crc kubenswrapper[4822]: I1010 08:50:26.618735 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerStarted","Data":"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a"} Oct 10 08:50:26 crc kubenswrapper[4822]: I1010 08:50:26.648502 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mdvlp" podStartSLOduration=1.994377249 podStartE2EDuration="5.64847903s" podCreationTimestamp="2025-10-10 08:50:21 +0000 UTC" firstStartedPulling="2025-10-10 08:50:22.574550063 +0000 UTC m=+8769.669708309" lastFinishedPulling="2025-10-10 08:50:26.228651894 +0000 UTC m=+8773.323810090" observedRunningTime="2025-10-10 08:50:26.648347576 +0000 UTC m=+8773.743505772" watchObservedRunningTime="2025-10-10 08:50:26.64847903 +0000 UTC m=+8773.743637236" Oct 10 08:50:26 crc kubenswrapper[4822]: I1010 08:50:26.650714 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:50:26 crc kubenswrapper[4822]: E1010 08:50:26.651116 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.793307 4822 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.797190 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.821313 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.970679 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnnc\" (UniqueName: \"kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.970768 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:29 crc kubenswrapper[4822]: I1010 08:50:29.970890 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.072874 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnnc\" (UniqueName: \"kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc\") pod \"community-operators-pxfcf\" (UID: 
\"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.072948 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.073011 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.073543 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.073573 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.095286 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnnc\" (UniqueName: \"kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc\") pod \"community-operators-pxfcf\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " 
pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.124388 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:30 crc kubenswrapper[4822]: W1010 08:50:30.749518 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b213c2_27d0_4150_8ca1_df2914563040.slice/crio-bae7410d7393a1db63e2ad0fa9e2298857d73a5c38e115b45885a380affbc4d1 WatchSource:0}: Error finding container bae7410d7393a1db63e2ad0fa9e2298857d73a5c38e115b45885a380affbc4d1: Status 404 returned error can't find the container with id bae7410d7393a1db63e2ad0fa9e2298857d73a5c38e115b45885a380affbc4d1 Oct 10 08:50:30 crc kubenswrapper[4822]: I1010 08:50:30.766992 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.670968 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.671510 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.685355 4822 generic.go:334] "Generic (PLEG): container finished" podID="c9b213c2-27d0-4150-8ca1-df2914563040" containerID="addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826" exitCode=0 Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.685403 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerDied","Data":"addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826"} Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.685432 4822 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerStarted","Data":"bae7410d7393a1db63e2ad0fa9e2298857d73a5c38e115b45885a380affbc4d1"} Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.736112 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:31 crc kubenswrapper[4822]: I1010 08:50:31.792854 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:32 crc kubenswrapper[4822]: I1010 08:50:32.702086 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerStarted","Data":"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5"} Oct 10 08:50:33 crc kubenswrapper[4822]: I1010 08:50:33.715388 4822 generic.go:334] "Generic (PLEG): container finished" podID="c9b213c2-27d0-4150-8ca1-df2914563040" containerID="a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5" exitCode=0 Oct 10 08:50:33 crc kubenswrapper[4822]: I1010 08:50:33.715489 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerDied","Data":"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5"} Oct 10 08:50:34 crc kubenswrapper[4822]: I1010 08:50:34.167374 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:34 crc kubenswrapper[4822]: I1010 08:50:34.167622 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mdvlp" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="registry-server" 
containerID="cri-o://993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a" gracePeriod=2 Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.297686 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.399962 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lzjv\" (UniqueName: \"kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv\") pod \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.400099 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content\") pod \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.400147 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities\") pod \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\" (UID: \"6e0a36d7-5bf5-4d21-9211-f0ba3c339946\") " Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.402601 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities" (OuterVolumeSpecName: "utilities") pod "6e0a36d7-5bf5-4d21-9211-f0ba3c339946" (UID: "6e0a36d7-5bf5-4d21-9211-f0ba3c339946"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.409298 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv" (OuterVolumeSpecName: "kube-api-access-8lzjv") pod "6e0a36d7-5bf5-4d21-9211-f0ba3c339946" (UID: "6e0a36d7-5bf5-4d21-9211-f0ba3c339946"). InnerVolumeSpecName "kube-api-access-8lzjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.449820 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e0a36d7-5bf5-4d21-9211-f0ba3c339946" (UID: "6e0a36d7-5bf5-4d21-9211-f0ba3c339946"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.503007 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lzjv\" (UniqueName: \"kubernetes.io/projected/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-kube-api-access-8lzjv\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.503046 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.503056 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0a36d7-5bf5-4d21-9211-f0ba3c339946-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.736073 4822 generic.go:334] "Generic (PLEG): container finished" podID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" 
containerID="993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a" exitCode=0 Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.736161 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerDied","Data":"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a"} Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.736196 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdvlp" event={"ID":"6e0a36d7-5bf5-4d21-9211-f0ba3c339946","Type":"ContainerDied","Data":"912d66e2f237d9aa43bdfcf4d414334d72583e42f1d673c48e3c452ed770c943"} Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.736238 4822 scope.go:117] "RemoveContainer" containerID="993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.736399 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdvlp" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.741388 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerStarted","Data":"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62"} Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.762604 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.778049 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mdvlp"] Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.789452 4822 scope.go:117] "RemoveContainer" containerID="055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.791588 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pxfcf" podStartSLOduration=4.022695694 podStartE2EDuration="6.791567236s" podCreationTimestamp="2025-10-10 08:50:29 +0000 UTC" firstStartedPulling="2025-10-10 08:50:31.68905922 +0000 UTC m=+8778.784217436" lastFinishedPulling="2025-10-10 08:50:34.457930782 +0000 UTC m=+8781.553088978" observedRunningTime="2025-10-10 08:50:35.783757352 +0000 UTC m=+8782.878915558" watchObservedRunningTime="2025-10-10 08:50:35.791567236 +0000 UTC m=+8782.886725432" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.815369 4822 scope.go:117] "RemoveContainer" containerID="751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.878792 4822 scope.go:117] "RemoveContainer" containerID="993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a" Oct 10 08:50:35 crc kubenswrapper[4822]: E1010 08:50:35.879215 4822 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a\": container with ID starting with 993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a not found: ID does not exist" containerID="993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.879259 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a"} err="failed to get container status \"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a\": rpc error: code = NotFound desc = could not find container \"993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a\": container with ID starting with 993e827d44a78a9141d52d71e3d42c32d6d2426c1fb19861bc03cf1452346d7a not found: ID does not exist" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.879284 4822 scope.go:117] "RemoveContainer" containerID="055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56" Oct 10 08:50:35 crc kubenswrapper[4822]: E1010 08:50:35.879541 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56\": container with ID starting with 055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56 not found: ID does not exist" containerID="055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.879568 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56"} err="failed to get container status \"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56\": rpc error: code = NotFound desc = could 
not find container \"055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56\": container with ID starting with 055b2537ad235bdcaa39c0c90ab3a8f9237fdfd559ebcf0c78da59b5ec697c56 not found: ID does not exist" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.879588 4822 scope.go:117] "RemoveContainer" containerID="751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95" Oct 10 08:50:35 crc kubenswrapper[4822]: E1010 08:50:35.879962 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95\": container with ID starting with 751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95 not found: ID does not exist" containerID="751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95" Oct 10 08:50:35 crc kubenswrapper[4822]: I1010 08:50:35.879980 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95"} err="failed to get container status \"751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95\": rpc error: code = NotFound desc = could not find container \"751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95\": container with ID starting with 751d933518dc73be60975d1b6225ad6a5e0e598969ca01a80940e68a8708ce95 not found: ID does not exist" Oct 10 08:50:37 crc kubenswrapper[4822]: I1010 08:50:37.672688 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" path="/var/lib/kubelet/pods/6e0a36d7-5bf5-4d21-9211-f0ba3c339946/volumes" Oct 10 08:50:39 crc kubenswrapper[4822]: I1010 08:50:39.651240 4822 scope.go:117] "RemoveContainer" containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.125395 4822 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.126093 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.183444 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.810131 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02"} Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.878923 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:40 crc kubenswrapper[4822]: I1010 08:50:40.939362 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:42 crc kubenswrapper[4822]: I1010 08:50:42.830523 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pxfcf" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="registry-server" containerID="cri-o://db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62" gracePeriod=2 Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.365545 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.493963 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnnc\" (UniqueName: \"kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc\") pod \"c9b213c2-27d0-4150-8ca1-df2914563040\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.494101 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content\") pod \"c9b213c2-27d0-4150-8ca1-df2914563040\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.494250 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities\") pod \"c9b213c2-27d0-4150-8ca1-df2914563040\" (UID: \"c9b213c2-27d0-4150-8ca1-df2914563040\") " Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.495243 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities" (OuterVolumeSpecName: "utilities") pod "c9b213c2-27d0-4150-8ca1-df2914563040" (UID: "c9b213c2-27d0-4150-8ca1-df2914563040"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.495923 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.500429 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc" (OuterVolumeSpecName: "kube-api-access-8tnnc") pod "c9b213c2-27d0-4150-8ca1-df2914563040" (UID: "c9b213c2-27d0-4150-8ca1-df2914563040"). InnerVolumeSpecName "kube-api-access-8tnnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.598744 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnnc\" (UniqueName: \"kubernetes.io/projected/c9b213c2-27d0-4150-8ca1-df2914563040-kube-api-access-8tnnc\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.840692 4822 generic.go:334] "Generic (PLEG): container finished" podID="c9b213c2-27d0-4150-8ca1-df2914563040" containerID="db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62" exitCode=0 Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.840737 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerDied","Data":"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62"} Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.840753 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxfcf" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.840777 4822 scope.go:117] "RemoveContainer" containerID="db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.840765 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxfcf" event={"ID":"c9b213c2-27d0-4150-8ca1-df2914563040","Type":"ContainerDied","Data":"bae7410d7393a1db63e2ad0fa9e2298857d73a5c38e115b45885a380affbc4d1"} Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.864253 4822 scope.go:117] "RemoveContainer" containerID="a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.893272 4822 scope.go:117] "RemoveContainer" containerID="addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.933950 4822 scope.go:117] "RemoveContainer" containerID="db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62" Oct 10 08:50:43 crc kubenswrapper[4822]: E1010 08:50:43.934432 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62\": container with ID starting with db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62 not found: ID does not exist" containerID="db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.934482 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62"} err="failed to get container status \"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62\": rpc error: code = NotFound desc = could not find container 
\"db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62\": container with ID starting with db8a35cad305c036ef27c4b9403063e9806b354d9724aede0127a6f565f2dd62 not found: ID does not exist" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.934509 4822 scope.go:117] "RemoveContainer" containerID="a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5" Oct 10 08:50:43 crc kubenswrapper[4822]: E1010 08:50:43.935126 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5\": container with ID starting with a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5 not found: ID does not exist" containerID="a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.935154 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5"} err="failed to get container status \"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5\": rpc error: code = NotFound desc = could not find container \"a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5\": container with ID starting with a1d91509a2e295fa9fe7ba43eddcb7f1ca505d8f8c7f7ce33eeb83f792a3bfe5 not found: ID does not exist" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.935170 4822 scope.go:117] "RemoveContainer" containerID="addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826" Oct 10 08:50:43 crc kubenswrapper[4822]: E1010 08:50:43.935519 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826\": container with ID starting with addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826 not found: ID does not exist" 
containerID="addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826" Oct 10 08:50:43 crc kubenswrapper[4822]: I1010 08:50:43.935543 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826"} err="failed to get container status \"addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826\": rpc error: code = NotFound desc = could not find container \"addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826\": container with ID starting with addf5971853fee1e6b55b8bbe3433a16c1ead1390857458e1e19253fc7434826 not found: ID does not exist" Oct 10 08:50:44 crc kubenswrapper[4822]: I1010 08:50:44.051312 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9b213c2-27d0-4150-8ca1-df2914563040" (UID: "c9b213c2-27d0-4150-8ca1-df2914563040"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:50:44 crc kubenswrapper[4822]: I1010 08:50:44.110289 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9b213c2-27d0-4150-8ca1-df2914563040-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:44 crc kubenswrapper[4822]: I1010 08:50:44.176334 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:44 crc kubenswrapper[4822]: I1010 08:50:44.188575 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pxfcf"] Oct 10 08:50:45 crc kubenswrapper[4822]: I1010 08:50:45.668937 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" path="/var/lib/kubelet/pods/c9b213c2-27d0-4150-8ca1-df2914563040/volumes" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.029595 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031444 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031479 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031514 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="extract-utilities" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031532 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="extract-utilities" Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031575 4822 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031596 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031641 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="extract-content" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031663 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="extract-content" Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031701 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="extract-content" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031717 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="extract-content" Oct 10 08:51:32 crc kubenswrapper[4822]: E1010 08:51:32.031740 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="extract-utilities" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.031756 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="extract-utilities" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.032323 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0a36d7-5bf5-4d21-9211-f0ba3c339946" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.032408 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b213c2-27d0-4150-8ca1-df2914563040" containerName="registry-server" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.036706 4822 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.061960 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.165654 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.165933 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmg8\" (UniqueName: \"kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.165984 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.268329 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmg8\" (UniqueName: \"kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.268406 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.268486 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.269151 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.269410 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.294156 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmg8\" (UniqueName: \"kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8\") pod \"redhat-operators-9ssbj\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.372719 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:32 crc kubenswrapper[4822]: I1010 08:51:32.858360 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:33 crc kubenswrapper[4822]: I1010 08:51:33.477977 4822 generic.go:334] "Generic (PLEG): container finished" podID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerID="b3dc8a50ec31868381b41576c99ee7bb7022b6c2ab996f49cff0e08c8f011516" exitCode=0 Oct 10 08:51:33 crc kubenswrapper[4822]: I1010 08:51:33.478050 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerDied","Data":"b3dc8a50ec31868381b41576c99ee7bb7022b6c2ab996f49cff0e08c8f011516"} Oct 10 08:51:33 crc kubenswrapper[4822]: I1010 08:51:33.478324 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerStarted","Data":"ee2c4f3cb941d248caee9b59b447306666b56c8bb7da92066672ee051f042584"} Oct 10 08:51:34 crc kubenswrapper[4822]: I1010 08:51:34.977940 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:34 crc kubenswrapper[4822]: I1010 08:51:34.981195 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.005773 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.026651 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.026764 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.026901 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbndt\" (UniqueName: \"kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.129059 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbndt\" (UniqueName: \"kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.129201 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.129316 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.130169 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.130487 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.157056 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbndt\" (UniqueName: \"kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt\") pod \"redhat-marketplace-lj9gs\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.317791 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.498607 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerStarted","Data":"423fbd0b7df6faa0d046423f38d041458c9dad9d81930bef6b2aec0c6f6ee0c4"} Oct 10 08:51:35 crc kubenswrapper[4822]: I1010 08:51:35.822958 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:35 crc kubenswrapper[4822]: W1010 08:51:35.829378 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080541fd_c4df_4a05_81ea_801178942dd8.slice/crio-f0400e2b6096ae05b466abbb9e9d26c10cf6ce00aa46ec93771323fe12d9d6d7 WatchSource:0}: Error finding container f0400e2b6096ae05b466abbb9e9d26c10cf6ce00aa46ec93771323fe12d9d6d7: Status 404 returned error can't find the container with id f0400e2b6096ae05b466abbb9e9d26c10cf6ce00aa46ec93771323fe12d9d6d7 Oct 10 08:51:36 crc kubenswrapper[4822]: I1010 08:51:36.514889 4822 generic.go:334] "Generic (PLEG): container finished" podID="080541fd-c4df-4a05-81ea-801178942dd8" containerID="5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72" exitCode=0 Oct 10 08:51:36 crc kubenswrapper[4822]: I1010 08:51:36.516637 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerDied","Data":"5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72"} Oct 10 08:51:36 crc kubenswrapper[4822]: I1010 08:51:36.516662 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" 
event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerStarted","Data":"f0400e2b6096ae05b466abbb9e9d26c10cf6ce00aa46ec93771323fe12d9d6d7"} Oct 10 08:51:37 crc kubenswrapper[4822]: I1010 08:51:37.530743 4822 generic.go:334] "Generic (PLEG): container finished" podID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerID="423fbd0b7df6faa0d046423f38d041458c9dad9d81930bef6b2aec0c6f6ee0c4" exitCode=0 Oct 10 08:51:37 crc kubenswrapper[4822]: I1010 08:51:37.530827 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerDied","Data":"423fbd0b7df6faa0d046423f38d041458c9dad9d81930bef6b2aec0c6f6ee0c4"} Oct 10 08:51:38 crc kubenswrapper[4822]: I1010 08:51:38.547263 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerStarted","Data":"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f"} Oct 10 08:51:39 crc kubenswrapper[4822]: I1010 08:51:39.561117 4822 generic.go:334] "Generic (PLEG): container finished" podID="080541fd-c4df-4a05-81ea-801178942dd8" containerID="fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f" exitCode=0 Oct 10 08:51:39 crc kubenswrapper[4822]: I1010 08:51:39.561233 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerDied","Data":"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f"} Oct 10 08:51:39 crc kubenswrapper[4822]: I1010 08:51:39.563845 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerStarted","Data":"c723f289932630f636f7ab5046bf5ec63c0429ede9ba7cef9b900b08b8629863"} Oct 10 08:51:39 crc kubenswrapper[4822]: I1010 
08:51:39.608861 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9ssbj" podStartSLOduration=3.070023592 podStartE2EDuration="8.608829777s" podCreationTimestamp="2025-10-10 08:51:31 +0000 UTC" firstStartedPulling="2025-10-10 08:51:33.480420346 +0000 UTC m=+8840.575578582" lastFinishedPulling="2025-10-10 08:51:39.019226551 +0000 UTC m=+8846.114384767" observedRunningTime="2025-10-10 08:51:39.603462543 +0000 UTC m=+8846.698620759" watchObservedRunningTime="2025-10-10 08:51:39.608829777 +0000 UTC m=+8846.703987983" Oct 10 08:51:41 crc kubenswrapper[4822]: I1010 08:51:41.584154 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerStarted","Data":"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f"} Oct 10 08:51:41 crc kubenswrapper[4822]: I1010 08:51:41.609384 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lj9gs" podStartSLOduration=4.050441675 podStartE2EDuration="7.609358085s" podCreationTimestamp="2025-10-10 08:51:34 +0000 UTC" firstStartedPulling="2025-10-10 08:51:36.518501162 +0000 UTC m=+8843.613659358" lastFinishedPulling="2025-10-10 08:51:40.077417572 +0000 UTC m=+8847.172575768" observedRunningTime="2025-10-10 08:51:41.600005787 +0000 UTC m=+8848.695163983" watchObservedRunningTime="2025-10-10 08:51:41.609358085 +0000 UTC m=+8848.704516281" Oct 10 08:51:42 crc kubenswrapper[4822]: I1010 08:51:42.373533 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:42 crc kubenswrapper[4822]: I1010 08:51:42.374192 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:44 crc kubenswrapper[4822]: I1010 08:51:44.223569 4822 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9ssbj" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="registry-server" probeResult="failure" output=< Oct 10 08:51:44 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 08:51:44 crc kubenswrapper[4822]: > Oct 10 08:51:45 crc kubenswrapper[4822]: I1010 08:51:45.318627 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:45 crc kubenswrapper[4822]: I1010 08:51:45.319865 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:45 crc kubenswrapper[4822]: I1010 08:51:45.381809 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:45 crc kubenswrapper[4822]: I1010 08:51:45.695453 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:45 crc kubenswrapper[4822]: I1010 08:51:45.759015 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:47 crc kubenswrapper[4822]: I1010 08:51:47.650272 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lj9gs" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="registry-server" containerID="cri-o://836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f" gracePeriod=2 Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.214639 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.383926 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content\") pod \"080541fd-c4df-4a05-81ea-801178942dd8\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.384120 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbndt\" (UniqueName: \"kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt\") pod \"080541fd-c4df-4a05-81ea-801178942dd8\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.384239 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities\") pod \"080541fd-c4df-4a05-81ea-801178942dd8\" (UID: \"080541fd-c4df-4a05-81ea-801178942dd8\") " Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.385011 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities" (OuterVolumeSpecName: "utilities") pod "080541fd-c4df-4a05-81ea-801178942dd8" (UID: "080541fd-c4df-4a05-81ea-801178942dd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.393719 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt" (OuterVolumeSpecName: "kube-api-access-bbndt") pod "080541fd-c4df-4a05-81ea-801178942dd8" (UID: "080541fd-c4df-4a05-81ea-801178942dd8"). InnerVolumeSpecName "kube-api-access-bbndt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.405054 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "080541fd-c4df-4a05-81ea-801178942dd8" (UID: "080541fd-c4df-4a05-81ea-801178942dd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.486700 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.486747 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080541fd-c4df-4a05-81ea-801178942dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.486770 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbndt\" (UniqueName: \"kubernetes.io/projected/080541fd-c4df-4a05-81ea-801178942dd8-kube-api-access-bbndt\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.661292 4822 generic.go:334] "Generic (PLEG): container finished" podID="080541fd-c4df-4a05-81ea-801178942dd8" containerID="836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f" exitCode=0 Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.661339 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lj9gs" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.661336 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerDied","Data":"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f"} Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.661383 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lj9gs" event={"ID":"080541fd-c4df-4a05-81ea-801178942dd8","Type":"ContainerDied","Data":"f0400e2b6096ae05b466abbb9e9d26c10cf6ce00aa46ec93771323fe12d9d6d7"} Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.661403 4822 scope.go:117] "RemoveContainer" containerID="836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.694604 4822 scope.go:117] "RemoveContainer" containerID="fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.703731 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.719704 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lj9gs"] Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.732253 4822 scope.go:117] "RemoveContainer" containerID="5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.771695 4822 scope.go:117] "RemoveContainer" containerID="836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f" Oct 10 08:51:48 crc kubenswrapper[4822]: E1010 08:51:48.772156 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f\": container with ID starting with 836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f not found: ID does not exist" containerID="836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.772198 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f"} err="failed to get container status \"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f\": rpc error: code = NotFound desc = could not find container \"836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f\": container with ID starting with 836389dc14cca1c853807a9d9aa40d092351b93a6a4639ab4e596be24db2fc4f not found: ID does not exist" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.772223 4822 scope.go:117] "RemoveContainer" containerID="fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f" Oct 10 08:51:48 crc kubenswrapper[4822]: E1010 08:51:48.772553 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f\": container with ID starting with fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f not found: ID does not exist" containerID="fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.772579 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f"} err="failed to get container status \"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f\": rpc error: code = NotFound desc = could not find container \"fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f\": container with ID 
starting with fffdc3cd36b374f7e670d14266e67d3f39253d943b2066db78f69b62e0255c3f not found: ID does not exist" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.772594 4822 scope.go:117] "RemoveContainer" containerID="5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72" Oct 10 08:51:48 crc kubenswrapper[4822]: E1010 08:51:48.772843 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72\": container with ID starting with 5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72 not found: ID does not exist" containerID="5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72" Oct 10 08:51:48 crc kubenswrapper[4822]: I1010 08:51:48.772875 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72"} err="failed to get container status \"5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72\": rpc error: code = NotFound desc = could not find container \"5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72\": container with ID starting with 5d656ed6d39e6ab158b8b3970c019a1610d81b81a8ec376aa81024e241707e72 not found: ID does not exist" Oct 10 08:51:49 crc kubenswrapper[4822]: I1010 08:51:49.664280 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080541fd-c4df-4a05-81ea-801178942dd8" path="/var/lib/kubelet/pods/080541fd-c4df-4a05-81ea-801178942dd8/volumes" Oct 10 08:51:53 crc kubenswrapper[4822]: I1010 08:51:53.140244 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:53 crc kubenswrapper[4822]: I1010 08:51:53.198167 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:53 crc 
kubenswrapper[4822]: I1010 08:51:53.382981 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:54 crc kubenswrapper[4822]: I1010 08:51:54.732600 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9ssbj" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="registry-server" containerID="cri-o://c723f289932630f636f7ab5046bf5ec63c0429ede9ba7cef9b900b08b8629863" gracePeriod=2 Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.744621 4822 generic.go:334] "Generic (PLEG): container finished" podID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerID="c723f289932630f636f7ab5046bf5ec63c0429ede9ba7cef9b900b08b8629863" exitCode=0 Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.744687 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerDied","Data":"c723f289932630f636f7ab5046bf5ec63c0429ede9ba7cef9b900b08b8629863"} Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.745070 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ssbj" event={"ID":"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f","Type":"ContainerDied","Data":"ee2c4f3cb941d248caee9b59b447306666b56c8bb7da92066672ee051f042584"} Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.745122 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2c4f3cb941d248caee9b59b447306666b56c8bb7da92066672ee051f042584" Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.831222 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.971466 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities\") pod \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.972296 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvmg8\" (UniqueName: \"kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8\") pod \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.972446 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content\") pod \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\" (UID: \"b66b54a0-b7f4-4c71-bc20-5c84f9561b0f\") " Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.974078 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities" (OuterVolumeSpecName: "utilities") pod "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" (UID: "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:51:55 crc kubenswrapper[4822]: I1010 08:51:55.978838 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8" (OuterVolumeSpecName: "kube-api-access-nvmg8") pod "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" (UID: "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f"). InnerVolumeSpecName "kube-api-access-nvmg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.068086 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" (UID: "b66b54a0-b7f4-4c71-bc20-5c84f9561b0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.075449 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.075860 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvmg8\" (UniqueName: \"kubernetes.io/projected/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-kube-api-access-nvmg8\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.075990 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.754612 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ssbj" Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.795087 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:56 crc kubenswrapper[4822]: I1010 08:51:56.806689 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9ssbj"] Oct 10 08:51:57 crc kubenswrapper[4822]: I1010 08:51:57.666333 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" path="/var/lib/kubelet/pods/b66b54a0-b7f4-4c71-bc20-5c84f9561b0f/volumes" Oct 10 08:52:39 crc kubenswrapper[4822]: I1010 08:52:39.263352 4822 generic.go:334] "Generic (PLEG): container finished" podID="9d03d865-d40b-44f1-adf6-bc451617f98a" containerID="67681b68a85e2e9f25603d49de705c382a3002f1cc45f41dc2a6adb930099532" exitCode=0 Oct 10 08:52:39 crc kubenswrapper[4822]: I1010 08:52:39.263457 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" event={"ID":"9d03d865-d40b-44f1-adf6-bc451617f98a","Type":"ContainerDied","Data":"67681b68a85e2e9f25603d49de705c382a3002f1cc45f41dc2a6adb930099532"} Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.814169 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930229 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930505 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqfc\" (UniqueName: \"kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930580 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930678 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930765 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.930941 4822 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory\") pod \"9d03d865-d40b-44f1-adf6-bc451617f98a\" (UID: \"9d03d865-d40b-44f1-adf6-bc451617f98a\") " Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.938033 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.938845 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph" (OuterVolumeSpecName: "ceph") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.938928 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc" (OuterVolumeSpecName: "kube-api-access-wnqfc") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "kube-api-access-wnqfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.966190 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.968113 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory" (OuterVolumeSpecName: "inventory") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:40 crc kubenswrapper[4822]: I1010 08:52:40.968155 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "9d03d865-d40b-44f1-adf6-bc451617f98a" (UID: "9d03d865-d40b-44f1-adf6-bc451617f98a"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033900 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033941 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033951 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqfc\" (UniqueName: \"kubernetes.io/projected/9d03d865-d40b-44f1-adf6-bc451617f98a-kube-api-access-wnqfc\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033965 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-ssh-key\") on node \"crc\" 
DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033977 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.033988 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d03d865-d40b-44f1-adf6-bc451617f98a-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.288494 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" event={"ID":"9d03d865-d40b-44f1-adf6-bc451617f98a","Type":"ContainerDied","Data":"cc261bde6dd174f444f9970979b82b1ade54c95888e2a7f187687e1334ec6b58"} Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.288796 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc261bde6dd174f444f9970979b82b1ade54c95888e2a7f187687e1334ec6b58" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.288568 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-n9tvx" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398295 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4"] Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398767 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="extract-utilities" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398790 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="extract-utilities" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398845 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="extract-content" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398856 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="extract-content" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398873 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="extract-utilities" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398880 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="extract-utilities" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398891 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="extract-content" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398899 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="extract-content" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398916 4822 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398923 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398930 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398936 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: E1010 08:52:41.398959 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d03d865-d40b-44f1-adf6-bc451617f98a" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.398968 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d03d865-d40b-44f1-adf6-bc451617f98a" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.399179 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66b54a0-b7f4-4c71-bc20-5c84f9561b0f" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.399195 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="080541fd-c4df-4a05-81ea-801178942dd8" containerName="registry-server" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.399215 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d03d865-d40b-44f1-adf6-bc451617f98a" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.400299 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.404997 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.405225 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.405019 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.405172 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.405974 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.409753 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4"] Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.545540 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.546158 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.546365 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.546526 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.546672 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.546783 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.649667 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.649748 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.649963 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.650047 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.650162 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.650250 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.878531 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.879396 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.880538 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.880623 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key\") pod 
\"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.881050 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:41 crc kubenswrapper[4822]: I1010 08:52:41.883995 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5\") pod \"neutron-dhcp-openstack-openstack-cell1-jb6k4\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:42 crc kubenswrapper[4822]: I1010 08:52:42.019006 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:52:42 crc kubenswrapper[4822]: I1010 08:52:42.575254 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4"] Oct 10 08:52:43 crc kubenswrapper[4822]: I1010 08:52:43.332608 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" event={"ID":"718c7cdd-089d-42bc-8cab-74214e8ddeb3","Type":"ContainerStarted","Data":"ed616d1e3828c4e867d44f58a4d51d00a403e07c6eb81f7018680b9b052bc70e"} Oct 10 08:52:44 crc kubenswrapper[4822]: I1010 08:52:44.344368 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" event={"ID":"718c7cdd-089d-42bc-8cab-74214e8ddeb3","Type":"ContainerStarted","Data":"fafb0fc340b7092b45f532d503eab0619ef7061d9b990b5b872a15a319bcdd98"} Oct 10 08:52:44 crc kubenswrapper[4822]: I1010 08:52:44.368790 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" podStartSLOduration=2.8577252939999997 podStartE2EDuration="3.368706395s" podCreationTimestamp="2025-10-10 08:52:41 +0000 UTC" firstStartedPulling="2025-10-10 08:52:42.582041493 +0000 UTC m=+8909.677199689" lastFinishedPulling="2025-10-10 08:52:43.093022584 +0000 UTC m=+8910.188180790" observedRunningTime="2025-10-10 08:52:44.365582985 +0000 UTC m=+8911.460741191" watchObservedRunningTime="2025-10-10 08:52:44.368706395 +0000 UTC m=+8911.463864611" Oct 10 08:53:01 crc kubenswrapper[4822]: I1010 08:53:01.336856 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:53:01 crc kubenswrapper[4822]: I1010 08:53:01.337426 4822 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:53:31 crc kubenswrapper[4822]: I1010 08:53:31.337269 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:53:31 crc kubenswrapper[4822]: I1010 08:53:31.339081 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:54:01 crc kubenswrapper[4822]: I1010 08:54:01.336923 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:54:01 crc kubenswrapper[4822]: I1010 08:54:01.337447 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:54:01 crc kubenswrapper[4822]: I1010 08:54:01.337844 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:54:01 crc kubenswrapper[4822]: I1010 08:54:01.338676 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:54:01 crc kubenswrapper[4822]: I1010 08:54:01.338757 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02" gracePeriod=600 Oct 10 08:54:02 crc kubenswrapper[4822]: I1010 08:54:02.196308 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02" exitCode=0 Oct 10 08:54:02 crc kubenswrapper[4822]: I1010 08:54:02.196385 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02"} Oct 10 08:54:02 crc kubenswrapper[4822]: I1010 08:54:02.196903 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02"} Oct 10 08:54:02 crc kubenswrapper[4822]: I1010 08:54:02.196933 4822 scope.go:117] "RemoveContainer" 
containerID="4e4105f00988599ee743f0761d382680dbca598c4cd7fca9412e1e5fe95463c5" Oct 10 08:56:01 crc kubenswrapper[4822]: I1010 08:56:01.336618 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:56:01 crc kubenswrapper[4822]: I1010 08:56:01.337199 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:56:31 crc kubenswrapper[4822]: I1010 08:56:31.336362 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:56:31 crc kubenswrapper[4822]: I1010 08:56:31.339240 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:57:01 crc kubenswrapper[4822]: I1010 08:57:01.337240 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:57:01 crc kubenswrapper[4822]: I1010 08:57:01.337928 4822 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:57:01 crc kubenswrapper[4822]: I1010 08:57:01.337985 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 08:57:01 crc kubenswrapper[4822]: I1010 08:57:01.338984 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:57:01 crc kubenswrapper[4822]: I1010 08:57:01.339048 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" gracePeriod=600 Oct 10 08:57:01 crc kubenswrapper[4822]: E1010 08:57:01.707232 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:02 crc kubenswrapper[4822]: I1010 08:57:02.261282 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" exitCode=0 Oct 10 08:57:02 crc kubenswrapper[4822]: I1010 08:57:02.261337 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02"} Oct 10 08:57:02 crc kubenswrapper[4822]: I1010 08:57:02.261416 4822 scope.go:117] "RemoveContainer" containerID="3adc1a9861f5dc70b03073c466335bc2428e3c8db991845e4f554a19ee4afa02" Oct 10 08:57:02 crc kubenswrapper[4822]: I1010 08:57:02.262353 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:57:02 crc kubenswrapper[4822]: E1010 08:57:02.262941 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:12 crc kubenswrapper[4822]: I1010 08:57:12.379068 4822 generic.go:334] "Generic (PLEG): container finished" podID="718c7cdd-089d-42bc-8cab-74214e8ddeb3" containerID="fafb0fc340b7092b45f532d503eab0619ef7061d9b990b5b872a15a319bcdd98" exitCode=0 Oct 10 08:57:12 crc kubenswrapper[4822]: I1010 08:57:12.379191 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" event={"ID":"718c7cdd-089d-42bc-8cab-74214e8ddeb3","Type":"ContainerDied","Data":"fafb0fc340b7092b45f532d503eab0619ef7061d9b990b5b872a15a319bcdd98"} Oct 10 08:57:12 crc kubenswrapper[4822]: I1010 08:57:12.652213 4822 scope.go:117] 
"RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:57:12 crc kubenswrapper[4822]: E1010 08:57:12.652560 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.023964 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.155457 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.155535 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.155722 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.156259 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.156345 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.156600 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle\") pod \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\" (UID: \"718c7cdd-089d-42bc-8cab-74214e8ddeb3\") " Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.160236 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph" (OuterVolumeSpecName: "ceph") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.161307 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.161523 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5" (OuterVolumeSpecName: "kube-api-access-vv5q5") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "kube-api-access-vv5q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.183966 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.184285 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory" (OuterVolumeSpecName: "inventory") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.185538 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "718c7cdd-089d-42bc-8cab-74214e8ddeb3" (UID: "718c7cdd-089d-42bc-8cab-74214e8ddeb3"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259251 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259290 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259302 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259313 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259323 4822 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/718c7cdd-089d-42bc-8cab-74214e8ddeb3-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.259333 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/718c7cdd-089d-42bc-8cab-74214e8ddeb3-kube-api-access-vv5q5\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.405314 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" 
event={"ID":"718c7cdd-089d-42bc-8cab-74214e8ddeb3","Type":"ContainerDied","Data":"ed616d1e3828c4e867d44f58a4d51d00a403e07c6eb81f7018680b9b052bc70e"} Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.405359 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed616d1e3828c4e867d44f58a4d51d00a403e07c6eb81f7018680b9b052bc70e" Oct 10 08:57:14 crc kubenswrapper[4822]: I1010 08:57:14.405462 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jb6k4" Oct 10 08:57:27 crc kubenswrapper[4822]: I1010 08:57:27.650843 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:57:27 crc kubenswrapper[4822]: E1010 08:57:27.651678 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:39 crc kubenswrapper[4822]: I1010 08:57:39.540778 4822 scope.go:117] "RemoveContainer" containerID="c723f289932630f636f7ab5046bf5ec63c0429ede9ba7cef9b900b08b8629863" Oct 10 08:57:39 crc kubenswrapper[4822]: I1010 08:57:39.584360 4822 scope.go:117] "RemoveContainer" containerID="b3dc8a50ec31868381b41576c99ee7bb7022b6c2ab996f49cff0e08c8f011516" Oct 10 08:57:40 crc kubenswrapper[4822]: I1010 08:57:40.003313 4822 scope.go:117] "RemoveContainer" containerID="423fbd0b7df6faa0d046423f38d041458c9dad9d81930bef6b2aec0c6f6ee0c4" Oct 10 08:57:41 crc kubenswrapper[4822]: I1010 08:57:41.650692 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:57:41 crc 
kubenswrapper[4822]: E1010 08:57:41.651623 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:45 crc kubenswrapper[4822]: I1010 08:57:45.907220 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:45 crc kubenswrapper[4822]: I1010 08:57:45.908057 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="82979957-20d2-4f04-8595-6ba826b061d9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.701488 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.702511 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerName="nova-cell1-conductor-conductor" containerID="cri-o://6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.821827 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.822069 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-log" 
containerID="cri-o://32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.822215 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-api" containerID="cri-o://c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.847949 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.848221 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" containerID="cri-o://a5cf0feb59f3bc799a7557ae0334acbfbe76ba7aff7c2d24c3a1350b2948dec9" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.848831 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" containerID="cri-o://11415ca17046e0e8a0dab09d26116dce4c62acf438787669e4a12a2fc0bb829c" gracePeriod=30 Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.864239 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:57:46 crc kubenswrapper[4822]: I1010 08:57:46.864495 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerName="nova-scheduler-scheduler" containerID="cri-o://206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" gracePeriod=30 Oct 10 08:57:47 crc kubenswrapper[4822]: I1010 08:57:47.811724 4822 generic.go:334] "Generic (PLEG): container finished" podID="e146fc70-ffc2-40af-b67c-f636fa7019b6" 
containerID="a5cf0feb59f3bc799a7557ae0334acbfbe76ba7aff7c2d24c3a1350b2948dec9" exitCode=143 Oct 10 08:57:47 crc kubenswrapper[4822]: I1010 08:57:47.811784 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerDied","Data":"a5cf0feb59f3bc799a7557ae0334acbfbe76ba7aff7c2d24c3a1350b2948dec9"} Oct 10 08:57:47 crc kubenswrapper[4822]: I1010 08:57:47.817153 4822 generic.go:334] "Generic (PLEG): container finished" podID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerID="32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a" exitCode=143 Oct 10 08:57:47 crc kubenswrapper[4822]: I1010 08:57:47.817190 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerDied","Data":"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a"} Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.008937 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.011046 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.013083 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.013153 4822 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerName="nova-scheduler-scheduler" Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.204759 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 is running failed: container process not found" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.205647 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 is running failed: container process not found" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.206006 4822 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 is running failed: container process not found" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.206223 4822 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerName="nova-cell1-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.555820 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.674330 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle\") pod \"82979957-20d2-4f04-8595-6ba826b061d9\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.674489 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x566\" (UniqueName: \"kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566\") pod \"82979957-20d2-4f04-8595-6ba826b061d9\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.676310 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data\") pod \"82979957-20d2-4f04-8595-6ba826b061d9\" (UID: \"82979957-20d2-4f04-8595-6ba826b061d9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.677288 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.683266 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566" (OuterVolumeSpecName: "kube-api-access-8x566") pod "82979957-20d2-4f04-8595-6ba826b061d9" (UID: "82979957-20d2-4f04-8595-6ba826b061d9"). InnerVolumeSpecName "kube-api-access-8x566". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.718585 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82979957-20d2-4f04-8595-6ba826b061d9" (UID: "82979957-20d2-4f04-8595-6ba826b061d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.724037 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data" (OuterVolumeSpecName: "config-data") pod "82979957-20d2-4f04-8595-6ba826b061d9" (UID: "82979957-20d2-4f04-8595-6ba826b061d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.778202 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle\") pod \"09ade431-bbe8-404b-a690-4e1eb2c542f9\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.778265 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvkw\" (UniqueName: \"kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw\") pod \"09ade431-bbe8-404b-a690-4e1eb2c542f9\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.778315 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data\") pod \"09ade431-bbe8-404b-a690-4e1eb2c542f9\" (UID: \"09ade431-bbe8-404b-a690-4e1eb2c542f9\") " Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.779184 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.779202 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82979957-20d2-4f04-8595-6ba826b061d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.779213 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x566\" (UniqueName: \"kubernetes.io/projected/82979957-20d2-4f04-8595-6ba826b061d9-kube-api-access-8x566\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 
08:57:48.784040 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw" (OuterVolumeSpecName: "kube-api-access-wkvkw") pod "09ade431-bbe8-404b-a690-4e1eb2c542f9" (UID: "09ade431-bbe8-404b-a690-4e1eb2c542f9"). InnerVolumeSpecName "kube-api-access-wkvkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.807079 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data" (OuterVolumeSpecName: "config-data") pod "09ade431-bbe8-404b-a690-4e1eb2c542f9" (UID: "09ade431-bbe8-404b-a690-4e1eb2c542f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.813115 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ade431-bbe8-404b-a690-4e1eb2c542f9" (UID: "09ade431-bbe8-404b-a690-4e1eb2c542f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.830463 4822 generic.go:334] "Generic (PLEG): container finished" podID="82979957-20d2-4f04-8595-6ba826b061d9" containerID="fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8" exitCode=0 Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.830535 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.830551 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"82979957-20d2-4f04-8595-6ba826b061d9","Type":"ContainerDied","Data":"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8"} Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.830626 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"82979957-20d2-4f04-8595-6ba826b061d9","Type":"ContainerDied","Data":"62effe3d02f5ef935685212204980b0d2b80fd82b744df022116b133832b9970"} Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.830796 4822 scope.go:117] "RemoveContainer" containerID="fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.832651 4822 generic.go:334] "Generic (PLEG): container finished" podID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" exitCode=0 Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.832693 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.832703 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"09ade431-bbe8-404b-a690-4e1eb2c542f9","Type":"ContainerDied","Data":"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752"} Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.832761 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"09ade431-bbe8-404b-a690-4e1eb2c542f9","Type":"ContainerDied","Data":"584d1c36666076a54220e36ce6faab93019edcd5b16bb7f5fc0ee1ecb2935319"} Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.877886 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.878397 4822 scope.go:117] "RemoveContainer" containerID="fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8" Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.880200 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8\": container with ID starting with fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8 not found: ID does not exist" containerID="fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.880265 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8"} err="failed to get container status \"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8\": rpc error: code = NotFound desc = could not find container \"fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8\": container with ID starting with 
fe7161684d03c0dda1355e23adf5780c4f91a3f549352cee403b58450e0f56c8 not found: ID does not exist" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.880293 4822 scope.go:117] "RemoveContainer" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.894044 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.899045 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.902003 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.902040 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvkw\" (UniqueName: \"kubernetes.io/projected/09ade431-bbe8-404b-a690-4e1eb2c542f9-kube-api-access-wkvkw\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.902054 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ade431-bbe8-404b-a690-4e1eb2c542f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.911436 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.921214 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.921847 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerName="nova-cell1-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.921883 
4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerName="nova-cell1-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.921846 4822 scope.go:117] "RemoveContainer" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.921927 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82979957-20d2-4f04-8595-6ba826b061d9" containerName="nova-cell0-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.921936 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="82979957-20d2-4f04-8595-6ba826b061d9" containerName="nova-cell0-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.921945 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718c7cdd-089d-42bc-8cab-74214e8ddeb3" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.921952 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="718c7cdd-089d-42bc-8cab-74214e8ddeb3" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.922182 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="82979957-20d2-4f04-8595-6ba826b061d9" containerName="nova-cell0-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.922213 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" containerName="nova-cell1-conductor-conductor" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.922243 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="718c7cdd-089d-42bc-8cab-74214e8ddeb3" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 08:57:48 crc kubenswrapper[4822]: E1010 08:57:48.922345 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752\": container with ID starting with 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 not found: ID does not exist" containerID="6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.922406 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752"} err="failed to get container status \"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752\": rpc error: code = NotFound desc = could not find container \"6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752\": container with ID starting with 6b5c627f24e7eb35dab791c6236cca09af27d3134a9e59804f99369ee68d2752 not found: ID does not exist" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.923157 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.925987 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.944202 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.945716 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.948598 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.962091 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:48 crc kubenswrapper[4822]: I1010 08:57:48.979983 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.105704 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mm9t\" (UniqueName: \"kubernetes.io/projected/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-kube-api-access-2mm9t\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.106091 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.106116 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.106178 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.106241 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.107076 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xlq8\" (UniqueName: \"kubernetes.io/projected/26937e57-4862-4167-b627-9711868b3b60-kube-api-access-4xlq8\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.209762 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xlq8\" (UniqueName: \"kubernetes.io/projected/26937e57-4862-4167-b627-9711868b3b60-kube-api-access-4xlq8\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.209884 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm9t\" (UniqueName: \"kubernetes.io/projected/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-kube-api-access-2mm9t\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.210098 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.210119 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.210960 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.211204 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.215232 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.216099 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.216321 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.217337 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26937e57-4862-4167-b627-9711868b3b60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.226213 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xlq8\" (UniqueName: \"kubernetes.io/projected/26937e57-4862-4167-b627-9711868b3b60-kube-api-access-4xlq8\") pod \"nova-cell0-conductor-0\" (UID: \"26937e57-4862-4167-b627-9711868b3b60\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.234676 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mm9t\" (UniqueName: \"kubernetes.io/projected/a43abc0e-2683-43f5-ab5e-5687c8ecd71b-kube-api-access-2mm9t\") pod \"nova-cell1-conductor-0\" (UID: \"a43abc0e-2683-43f5-ab5e-5687c8ecd71b\") " pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.245090 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.271472 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.662910 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ade431-bbe8-404b-a690-4e1eb2c542f9" path="/var/lib/kubelet/pods/09ade431-bbe8-404b-a690-4e1eb2c542f9/volumes" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.664834 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82979957-20d2-4f04-8595-6ba826b061d9" path="/var/lib/kubelet/pods/82979957-20d2-4f04-8595-6ba826b061d9/volumes" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.758120 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.832294 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:57:49 crc kubenswrapper[4822]: W1010 08:57:49.832297 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26937e57_4862_4167_b627_9711868b3b60.slice/crio-8e97671d36c218521d4c00d7806fcc9bd58e49c5b9f2bb133c061ee904da32be WatchSource:0}: Error finding container 8e97671d36c218521d4c00d7806fcc9bd58e49c5b9f2bb133c061ee904da32be: Status 404 returned error can't find the container with id 8e97671d36c218521d4c00d7806fcc9bd58e49c5b9f2bb133c061ee904da32be Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.844648 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26937e57-4862-4167-b627-9711868b3b60","Type":"ContainerStarted","Data":"8e97671d36c218521d4c00d7806fcc9bd58e49c5b9f2bb133c061ee904da32be"} Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.846629 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"a43abc0e-2683-43f5-ab5e-5687c8ecd71b","Type":"ContainerStarted","Data":"f85d2bdc1675bc08ef48807b1335b52e08899fd340e2cd0019aa2507e8ad84d9"} Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.994852 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:33114->10.217.1.84:8775: read: connection reset by peer" Oct 10 08:57:49 crc kubenswrapper[4822]: I1010 08:57:49.994929 4822 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:33118->10.217.1.84:8775: read: connection reset by peer" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.484573 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.642869 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle\") pod \"011a2ccc-0472-41e2-bf43-ecd546f26e67\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.642980 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data\") pod \"011a2ccc-0472-41e2-bf43-ecd546f26e67\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.643032 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs\") pod \"011a2ccc-0472-41e2-bf43-ecd546f26e67\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.643226 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpsgj\" (UniqueName: \"kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj\") pod \"011a2ccc-0472-41e2-bf43-ecd546f26e67\" (UID: \"011a2ccc-0472-41e2-bf43-ecd546f26e67\") " Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.644118 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs" (OuterVolumeSpecName: "logs") pod "011a2ccc-0472-41e2-bf43-ecd546f26e67" (UID: "011a2ccc-0472-41e2-bf43-ecd546f26e67"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.746215 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/011a2ccc-0472-41e2-bf43-ecd546f26e67-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.890218 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26937e57-4862-4167-b627-9711868b3b60","Type":"ContainerStarted","Data":"a27a2d122092e101ebc0b433063748fc5af1486470bf180ab4e826dbe25bf80a"} Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.891478 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.892494 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a43abc0e-2683-43f5-ab5e-5687c8ecd71b","Type":"ContainerStarted","Data":"2b13048f29d7ae96dcdfb95bc866e578a42f5c75167fcda04afcc2488ba80b5b"} Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.893685 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.896427 4822 generic.go:334] "Generic (PLEG): container finished" podID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerID="c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce" exitCode=0 Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.896496 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.896553 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerDied","Data":"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce"} Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.896583 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"011a2ccc-0472-41e2-bf43-ecd546f26e67","Type":"ContainerDied","Data":"e0c4a2ed9388cce7e362b9275d5828d5c429e588d387eaa85789a860f5d48f9c"} Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.896627 4822 scope.go:117] "RemoveContainer" containerID="c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.904741 4822 generic.go:334] "Generic (PLEG): container finished" podID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerID="11415ca17046e0e8a0dab09d26116dce4c62acf438787669e4a12a2fc0bb829c" exitCode=0 Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.904783 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerDied","Data":"11415ca17046e0e8a0dab09d26116dce4c62acf438787669e4a12a2fc0bb829c"} Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 08:57:50.925903 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.925882123 podStartE2EDuration="2.925882123s" podCreationTimestamp="2025-10-10 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:57:50.904283775 +0000 UTC m=+9217.999441981" watchObservedRunningTime="2025-10-10 08:57:50.925882123 +0000 UTC m=+9218.021040319" Oct 10 08:57:50 crc kubenswrapper[4822]: I1010 
08:57:50.933636 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.933618405 podStartE2EDuration="2.933618405s" podCreationTimestamp="2025-10-10 08:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:57:50.920369855 +0000 UTC m=+9218.015528071" watchObservedRunningTime="2025-10-10 08:57:50.933618405 +0000 UTC m=+9218.028776601" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.276408 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj" (OuterVolumeSpecName: "kube-api-access-jpsgj") pod "011a2ccc-0472-41e2-bf43-ecd546f26e67" (UID: "011a2ccc-0472-41e2-bf43-ecd546f26e67"). InnerVolumeSpecName "kube-api-access-jpsgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.290687 4822 scope.go:117] "RemoveContainer" containerID="32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.365839 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpsgj\" (UniqueName: \"kubernetes.io/projected/011a2ccc-0472-41e2-bf43-ecd546f26e67-kube-api-access-jpsgj\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.397013 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011a2ccc-0472-41e2-bf43-ecd546f26e67" (UID: "011a2ccc-0472-41e2-bf43-ecd546f26e67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.400194 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data" (OuterVolumeSpecName: "config-data") pod "011a2ccc-0472-41e2-bf43-ecd546f26e67" (UID: "011a2ccc-0472-41e2-bf43-ecd546f26e67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.467984 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.468017 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011a2ccc-0472-41e2-bf43-ecd546f26e67-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.489595 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.494872 4822 scope.go:117] "RemoveContainer" containerID="c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce" Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.495344 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce\": container with ID starting with c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce not found: ID does not exist" containerID="c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.495375 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce"} err="failed to get container status \"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce\": rpc error: code = NotFound desc = could not find container \"c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce\": container with ID starting with c79c413fb8aefbba85846c8071a93b40c34ea8779249c03bd8f1169f3a4222ce not found: ID does not exist" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.495393 4822 scope.go:117] "RemoveContainer" containerID="32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a" Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.495776 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a\": container with ID starting with 32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a not found: ID does not exist" containerID="32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 
08:57:51.495816 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a"} err="failed to get container status \"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a\": rpc error: code = NotFound desc = could not find container \"32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a\": container with ID starting with 32bbf3fc729f9416da7b081ba730adf94d7d95d6562f83b50340d48af1cede9a not found: ID does not exist" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.548909 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.561391 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.573194 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.574854 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.574878 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.574935 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-log" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.574945 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-log" Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.574968 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" Oct 10 
08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.574976 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" Oct 10 08:57:51 crc kubenswrapper[4822]: E1010 08:57:51.574995 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-api" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.575002 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-api" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.575292 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-api" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.575316 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" containerName="nova-api-log" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.575336 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-log" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.575356 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" containerName="nova-metadata-metadata" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.576977 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.580299 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.594278 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.661634 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011a2ccc-0472-41e2-bf43-ecd546f26e67" path="/var/lib/kubelet/pods/011a2ccc-0472-41e2-bf43-ecd546f26e67/volumes" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.672601 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs\") pod \"e146fc70-ffc2-40af-b67c-f636fa7019b6\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.672817 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data\") pod \"e146fc70-ffc2-40af-b67c-f636fa7019b6\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.673530 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z9ck\" (UniqueName: \"kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck\") pod \"e146fc70-ffc2-40af-b67c-f636fa7019b6\" (UID: \"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.673841 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle\") pod \"e146fc70-ffc2-40af-b67c-f636fa7019b6\" (UID: 
\"e146fc70-ffc2-40af-b67c-f636fa7019b6\") " Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.674907 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs" (OuterVolumeSpecName: "logs") pod "e146fc70-ffc2-40af-b67c-f636fa7019b6" (UID: "e146fc70-ffc2-40af-b67c-f636fa7019b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.679933 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck" (OuterVolumeSpecName: "kube-api-access-9z9ck") pod "e146fc70-ffc2-40af-b67c-f636fa7019b6" (UID: "e146fc70-ffc2-40af-b67c-f636fa7019b6"). InnerVolumeSpecName "kube-api-access-9z9ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.705161 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data" (OuterVolumeSpecName: "config-data") pod "e146fc70-ffc2-40af-b67c-f636fa7019b6" (UID: "e146fc70-ffc2-40af-b67c-f636fa7019b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.706656 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e146fc70-ffc2-40af-b67c-f636fa7019b6" (UID: "e146fc70-ffc2-40af-b67c-f636fa7019b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.777909 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-config-data\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.778237 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.778713 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfjk\" (UniqueName: \"kubernetes.io/projected/706de7f2-6f04-48b5-9510-aed6e16b14e1-kube-api-access-vwfjk\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.781135 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/706de7f2-6f04-48b5-9510-aed6e16b14e1-logs\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.781207 4822 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e146fc70-ffc2-40af-b67c-f636fa7019b6-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.781218 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.781228 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z9ck\" (UniqueName: \"kubernetes.io/projected/e146fc70-ffc2-40af-b67c-f636fa7019b6-kube-api-access-9z9ck\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.781238 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e146fc70-ffc2-40af-b67c-f636fa7019b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.883335 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfjk\" (UniqueName: \"kubernetes.io/projected/706de7f2-6f04-48b5-9510-aed6e16b14e1-kube-api-access-vwfjk\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.883424 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/706de7f2-6f04-48b5-9510-aed6e16b14e1-logs\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.883471 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-config-data\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.883495 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.884283 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/706de7f2-6f04-48b5-9510-aed6e16b14e1-logs\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.888925 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-config-data\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.890413 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706de7f2-6f04-48b5-9510-aed6e16b14e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.908626 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfjk\" (UniqueName: \"kubernetes.io/projected/706de7f2-6f04-48b5-9510-aed6e16b14e1-kube-api-access-vwfjk\") pod \"nova-api-0\" (UID: \"706de7f2-6f04-48b5-9510-aed6e16b14e1\") " pod="openstack/nova-api-0" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.923827 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e146fc70-ffc2-40af-b67c-f636fa7019b6","Type":"ContainerDied","Data":"abf571fff4416e28175de14e4a4d0a92ad430d8b7dbd4dce0ac91197a4890528"} Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.924133 4822 scope.go:117] "RemoveContainer" containerID="11415ca17046e0e8a0dab09d26116dce4c62acf438787669e4a12a2fc0bb829c" Oct 10 08:57:51 crc kubenswrapper[4822]: I1010 08:57:51.924389 4822 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.057755 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.068533 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.082109 4822 scope.go:117] "RemoveContainer" containerID="a5cf0feb59f3bc799a7557ae0334acbfbe76ba7aff7c2d24c3a1350b2948dec9" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.094377 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.096606 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.103504 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.113599 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.198891 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.291341 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21f845d-863d-4fb5-9303-13710403c771-logs\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.291678 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgccb\" (UniqueName: \"kubernetes.io/projected/a21f845d-863d-4fb5-9303-13710403c771-kube-api-access-jgccb\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.291775 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-config-data\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.291834 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.393623 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21f845d-863d-4fb5-9303-13710403c771-logs\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.393879 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgccb\" (UniqueName: \"kubernetes.io/projected/a21f845d-863d-4fb5-9303-13710403c771-kube-api-access-jgccb\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.393969 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-config-data\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.394004 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.397248 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21f845d-863d-4fb5-9303-13710403c771-logs\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.400438 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-config-data\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.400507 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21f845d-863d-4fb5-9303-13710403c771-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.413876 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgccb\" (UniqueName: \"kubernetes.io/projected/a21f845d-863d-4fb5-9303-13710403c771-kube-api-access-jgccb\") pod \"nova-metadata-0\" (UID: \"a21f845d-863d-4fb5-9303-13710403c771\") " pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.419037 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.535219 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.699616 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data\") pod \"006a06c2-ba4e-4aea-a817-73e66bd4720a\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.706359 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpm7g\" (UniqueName: \"kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g\") pod \"006a06c2-ba4e-4aea-a817-73e66bd4720a\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.706527 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle\") pod \"006a06c2-ba4e-4aea-a817-73e66bd4720a\" (UID: \"006a06c2-ba4e-4aea-a817-73e66bd4720a\") " Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.718050 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g" (OuterVolumeSpecName: "kube-api-access-cpm7g") pod "006a06c2-ba4e-4aea-a817-73e66bd4720a" (UID: "006a06c2-ba4e-4aea-a817-73e66bd4720a"). InnerVolumeSpecName "kube-api-access-cpm7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.722388 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.735561 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data" (OuterVolumeSpecName: "config-data") pod "006a06c2-ba4e-4aea-a817-73e66bd4720a" (UID: "006a06c2-ba4e-4aea-a817-73e66bd4720a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.745194 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "006a06c2-ba4e-4aea-a817-73e66bd4720a" (UID: "006a06c2-ba4e-4aea-a817-73e66bd4720a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.809572 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpm7g\" (UniqueName: \"kubernetes.io/projected/006a06c2-ba4e-4aea-a817-73e66bd4720a-kube-api-access-cpm7g\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.809950 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.809960 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006a06c2-ba4e-4aea-a817-73e66bd4720a-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.928920 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:57:52 crc kubenswrapper[4822]: W1010 08:57:52.929969 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda21f845d_863d_4fb5_9303_13710403c771.slice/crio-6390a77b67e04c4d20e087bcaa1d07bfbf91b4fd2b49a6dcc18c964ba3355994 WatchSource:0}: Error finding container 6390a77b67e04c4d20e087bcaa1d07bfbf91b4fd2b49a6dcc18c964ba3355994: Status 404 returned error can't find the container with id 6390a77b67e04c4d20e087bcaa1d07bfbf91b4fd2b49a6dcc18c964ba3355994 Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.935406 4822 generic.go:334] "Generic (PLEG): container finished" podID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" exitCode=0 Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.935476 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"006a06c2-ba4e-4aea-a817-73e66bd4720a","Type":"ContainerDied","Data":"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0"} Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.935503 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"006a06c2-ba4e-4aea-a817-73e66bd4720a","Type":"ContainerDied","Data":"168f62632dded33f662023c2b190a077037876ed9f509e3ddfd12865a2892974"} Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.935510 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.935521 4822 scope.go:117] "RemoveContainer" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.938311 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"706de7f2-6f04-48b5-9510-aed6e16b14e1","Type":"ContainerStarted","Data":"315f4849caf0770386fe0907250e6b836f6e43ae5cef0f13e1225421cc69012a"} Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.938358 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"706de7f2-6f04-48b5-9510-aed6e16b14e1","Type":"ContainerStarted","Data":"a8a9ef0f3cbce75fa15a949ef812a89a09962bd1743dcf16b86ee4bd121bcf3a"} Oct 10 08:57:52 crc kubenswrapper[4822]: I1010 08:57:52.982680 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.002346 4822 scope.go:117] "RemoveContainer" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.011769 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.023522 4822 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Oct 10 08:57:53 crc kubenswrapper[4822]: E1010 08:57:53.024049 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerName="nova-scheduler-scheduler" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.024069 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerName="nova-scheduler-scheduler" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.024294 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" containerName="nova-scheduler-scheduler" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.025158 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: E1010 08:57:53.029238 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0\": container with ID starting with 206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0 not found: ID does not exist" containerID="206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.029289 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0"} err="failed to get container status \"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0\": rpc error: code = NotFound desc = could not find container \"206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0\": container with ID starting with 206754e0690e76a81e4a5f20aaf567a1b387c3e63dfdd985995062f90b12afe0 not found: ID does not exist" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.029562 4822 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.055194 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.218199 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-config-data\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.218285 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpmb\" (UniqueName: \"kubernetes.io/projected/fb0c61d8-b919-4131-a03a-6fe380018721-kube-api-access-4qpmb\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.218324 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.320442 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-config-data\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.320537 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpmb\" (UniqueName: 
\"kubernetes.io/projected/fb0c61d8-b919-4131-a03a-6fe380018721-kube-api-access-4qpmb\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.320575 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.671617 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006a06c2-ba4e-4aea-a817-73e66bd4720a" path="/var/lib/kubelet/pods/006a06c2-ba4e-4aea-a817-73e66bd4720a/volumes" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.672793 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e146fc70-ffc2-40af-b67c-f636fa7019b6" path="/var/lib/kubelet/pods/e146fc70-ffc2-40af-b67c-f636fa7019b6/volumes" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.871770 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-config-data\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.872041 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c61d8-b919-4131-a03a-6fe380018721-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.872390 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpmb\" (UniqueName: 
\"kubernetes.io/projected/fb0c61d8-b919-4131-a03a-6fe380018721-kube-api-access-4qpmb\") pod \"nova-scheduler-0\" (UID: \"fb0c61d8-b919-4131-a03a-6fe380018721\") " pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.946395 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.956709 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a21f845d-863d-4fb5-9303-13710403c771","Type":"ContainerStarted","Data":"f6f5b7314b3466104a5dc0426c3367c1b0f4514f9b9c07c4330641c771b91102"} Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.956763 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a21f845d-863d-4fb5-9303-13710403c771","Type":"ContainerStarted","Data":"6390a77b67e04c4d20e087bcaa1d07bfbf91b4fd2b49a6dcc18c964ba3355994"} Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.961597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"706de7f2-6f04-48b5-9510-aed6e16b14e1","Type":"ContainerStarted","Data":"147838878cd6d266aac3cec217460e40e57c8f319c473c6bbe9d5929566bcec5"} Oct 10 08:57:53 crc kubenswrapper[4822]: I1010 08:57:53.996625 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.996599168 podStartE2EDuration="2.996599168s" podCreationTimestamp="2025-10-10 08:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:57:53.995406804 +0000 UTC m=+9221.090565040" watchObservedRunningTime="2025-10-10 08:57:53.996599168 +0000 UTC m=+9221.091757394" Oct 10 08:57:54 crc kubenswrapper[4822]: I1010 08:57:54.492748 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:57:54 crc 
kubenswrapper[4822]: I1010 08:57:54.974339 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a21f845d-863d-4fb5-9303-13710403c771","Type":"ContainerStarted","Data":"e8cfd8a1a24c92b0d2a6694edf9ae12251b0f9b31761a60937e7e6c7e5b738b4"} Oct 10 08:57:54 crc kubenswrapper[4822]: I1010 08:57:54.978711 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb0c61d8-b919-4131-a03a-6fe380018721","Type":"ContainerStarted","Data":"83b9854900c7a05e3abf73a0c6be18192aadd2666b172a5dd539a2816a7a189c"} Oct 10 08:57:54 crc kubenswrapper[4822]: I1010 08:57:54.978754 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb0c61d8-b919-4131-a03a-6fe380018721","Type":"ContainerStarted","Data":"580a1e8f258be278ab31b35fb2d84991ceb8ab52c02f3ea6ad85c171d2763cd8"} Oct 10 08:57:54 crc kubenswrapper[4822]: I1010 08:57:54.996921 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.996892308 podStartE2EDuration="2.996892308s" podCreationTimestamp="2025-10-10 08:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:57:54.991034571 +0000 UTC m=+9222.086192767" watchObservedRunningTime="2025-10-10 08:57:54.996892308 +0000 UTC m=+9222.092050584" Oct 10 08:57:55 crc kubenswrapper[4822]: I1010 08:57:55.018920 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.018892819 podStartE2EDuration="3.018892819s" podCreationTimestamp="2025-10-10 08:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:57:55.007397919 +0000 UTC m=+9222.102556125" watchObservedRunningTime="2025-10-10 08:57:55.018892819 +0000 UTC 
m=+9222.114051055" Oct 10 08:57:56 crc kubenswrapper[4822]: I1010 08:57:56.650980 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:57:56 crc kubenswrapper[4822]: E1010 08:57:56.651754 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:57:57 crc kubenswrapper[4822]: I1010 08:57:57.419908 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:57:57 crc kubenswrapper[4822]: I1010 08:57:57.420225 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:57:58 crc kubenswrapper[4822]: I1010 08:57:58.947271 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 08:57:59 crc kubenswrapper[4822]: I1010 08:57:59.292780 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 08:57:59 crc kubenswrapper[4822]: I1010 08:57:59.328589 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 08:58:02 crc kubenswrapper[4822]: I1010 08:58:02.200009 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:58:02 crc kubenswrapper[4822]: I1010 08:58:02.200389 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:58:02 crc kubenswrapper[4822]: I1010 08:58:02.420000 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:58:02 crc kubenswrapper[4822]: I1010 08:58:02.420198 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:58:03 crc kubenswrapper[4822]: I1010 08:58:03.282047 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="706de7f2-6f04-48b5-9510-aed6e16b14e1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:58:03 crc kubenswrapper[4822]: I1010 08:58:03.282088 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="706de7f2-6f04-48b5-9510-aed6e16b14e1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:58:03 crc kubenswrapper[4822]: I1010 08:58:03.503133 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a21f845d-863d-4fb5-9303-13710403c771" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:58:03 crc kubenswrapper[4822]: I1010 08:58:03.503158 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a21f845d-863d-4fb5-9303-13710403c771" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:58:03 crc kubenswrapper[4822]: I1010 08:58:03.946524 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 08:58:04 crc kubenswrapper[4822]: I1010 08:58:04.102176 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Oct 10 08:58:04 crc kubenswrapper[4822]: I1010 08:58:04.157046 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 08:58:09 crc kubenswrapper[4822]: I1010 08:58:09.655763 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:58:09 crc kubenswrapper[4822]: E1010 08:58:09.656719 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.203766 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.204559 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.209463 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.220924 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.423081 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.423405 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.425883 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Oct 10 08:58:12 crc kubenswrapper[4822]: I1010 08:58:12.429029 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 08:58:13 crc kubenswrapper[4822]: I1010 08:58:13.181864 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:58:13 crc kubenswrapper[4822]: I1010 08:58:13.185759 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.531930 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl"] Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.534507 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.545477 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-pgjlt" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.545833 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.545890 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.546636 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.546790 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.546952 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 10 
08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.547066 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.559914 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl"] Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.609961 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610007 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610061 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610118 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610154 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610183 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610204 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610223 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610239 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610257 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.610321 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2qll\" (UniqueName: \"kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.711994 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.712323 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.712898 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.712938 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713025 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713124 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713106 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713191 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713244 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2qll\" (UniqueName: \"kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713348 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713390 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.713477 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.717184 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: 
I1010 08:58:14.717598 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.717756 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.718671 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.719719 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.726763 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.726839 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.726998 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.729624 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.745686 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2qll\" (UniqueName: \"kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl\" (UID: 
\"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:14 crc kubenswrapper[4822]: I1010 08:58:14.870545 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 08:58:15 crc kubenswrapper[4822]: I1010 08:58:15.421669 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl"] Oct 10 08:58:15 crc kubenswrapper[4822]: W1010 08:58:15.438804 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1992f703_ffd6_47f1_ad78_44790cb9e0be.slice/crio-9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8 WatchSource:0}: Error finding container 9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8: Status 404 returned error can't find the container with id 9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8 Oct 10 08:58:15 crc kubenswrapper[4822]: I1010 08:58:15.441849 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:58:16 crc kubenswrapper[4822]: I1010 08:58:16.222645 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" event={"ID":"1992f703-ffd6-47f1-ad78-44790cb9e0be","Type":"ContainerStarted","Data":"9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8"} Oct 10 08:58:17 crc kubenswrapper[4822]: I1010 08:58:17.242311 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" event={"ID":"1992f703-ffd6-47f1-ad78-44790cb9e0be","Type":"ContainerStarted","Data":"6a8933f0e733a9bd8396375bb6d4e5336f8abb66b65e535d0d06a114524c9dd1"} Oct 10 08:58:17 crc kubenswrapper[4822]: I1010 
08:58:17.276037 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" podStartSLOduration=2.669857069 podStartE2EDuration="3.276012441s" podCreationTimestamp="2025-10-10 08:58:14 +0000 UTC" firstStartedPulling="2025-10-10 08:58:15.441472734 +0000 UTC m=+9242.536630940" lastFinishedPulling="2025-10-10 08:58:16.047628116 +0000 UTC m=+9243.142786312" observedRunningTime="2025-10-10 08:58:17.273137048 +0000 UTC m=+9244.368295264" watchObservedRunningTime="2025-10-10 08:58:17.276012441 +0000 UTC m=+9244.371170697" Oct 10 08:58:24 crc kubenswrapper[4822]: I1010 08:58:24.650492 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:58:24 crc kubenswrapper[4822]: E1010 08:58:24.651578 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:58:38 crc kubenswrapper[4822]: I1010 08:58:38.651342 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:58:38 crc kubenswrapper[4822]: E1010 08:58:38.652595 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:58:53 crc kubenswrapper[4822]: 
I1010 08:58:53.666082 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:58:53 crc kubenswrapper[4822]: E1010 08:58:53.667340 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:59:06 crc kubenswrapper[4822]: I1010 08:59:06.651344 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:59:06 crc kubenswrapper[4822]: E1010 08:59:06.652466 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:59:20 crc kubenswrapper[4822]: I1010 08:59:20.651252 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:59:20 crc kubenswrapper[4822]: E1010 08:59:20.652097 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:59:33 crc 
kubenswrapper[4822]: I1010 08:59:33.664568 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:59:33 crc kubenswrapper[4822]: E1010 08:59:33.666085 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 08:59:47 crc kubenswrapper[4822]: I1010 08:59:47.650933 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 08:59:47 crc kubenswrapper[4822]: E1010 08:59:47.652251 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.170173 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5"] Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.173906 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.179290 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.179390 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.182830 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5"] Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.299914 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6qz\" (UniqueName: \"kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.299981 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.300003 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.402346 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6qz\" (UniqueName: \"kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.402476 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.402512 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.403930 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.424905 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.426162 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6qz\" (UniqueName: \"kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz\") pod \"collect-profiles-29334780-vdkq5\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.498330 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:00 crc kubenswrapper[4822]: I1010 09:00:00.653543 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:00:00 crc kubenswrapper[4822]: E1010 09:00:00.654218 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:01 crc kubenswrapper[4822]: I1010 09:00:01.045021 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5"] Oct 10 09:00:01 crc kubenswrapper[4822]: I1010 09:00:01.550232 4822 generic.go:334] "Generic (PLEG): container finished" podID="b39fcbb0-f166-4204-8b21-1fe44c65129f" containerID="2780392697be4e76b612cb12d876ef93a05224598b245eb02cecfc777a71f21c" 
exitCode=0 Oct 10 09:00:01 crc kubenswrapper[4822]: I1010 09:00:01.550557 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" event={"ID":"b39fcbb0-f166-4204-8b21-1fe44c65129f","Type":"ContainerDied","Data":"2780392697be4e76b612cb12d876ef93a05224598b245eb02cecfc777a71f21c"} Oct 10 09:00:01 crc kubenswrapper[4822]: I1010 09:00:01.550597 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" event={"ID":"b39fcbb0-f166-4204-8b21-1fe44c65129f","Type":"ContainerStarted","Data":"ec3a234434b97d0bb13252548228462bbddc213a9e7b6600045f77eb51a7a437"} Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.731921 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.787840 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume\") pod \"b39fcbb0-f166-4204-8b21-1fe44c65129f\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.787892 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume\") pod \"b39fcbb0-f166-4204-8b21-1fe44c65129f\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.787956 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht6qz\" (UniqueName: \"kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz\") pod \"b39fcbb0-f166-4204-8b21-1fe44c65129f\" (UID: \"b39fcbb0-f166-4204-8b21-1fe44c65129f\") " Oct 10 
09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.789714 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b39fcbb0-f166-4204-8b21-1fe44c65129f" (UID: "b39fcbb0-f166-4204-8b21-1fe44c65129f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.795222 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b39fcbb0-f166-4204-8b21-1fe44c65129f" (UID: "b39fcbb0-f166-4204-8b21-1fe44c65129f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.796742 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz" (OuterVolumeSpecName: "kube-api-access-ht6qz") pod "b39fcbb0-f166-4204-8b21-1fe44c65129f" (UID: "b39fcbb0-f166-4204-8b21-1fe44c65129f"). InnerVolumeSpecName "kube-api-access-ht6qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.891757 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39fcbb0-f166-4204-8b21-1fe44c65129f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.891846 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39fcbb0-f166-4204-8b21-1fe44c65129f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:03 crc kubenswrapper[4822]: I1010 09:00:03.891880 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht6qz\" (UniqueName: \"kubernetes.io/projected/b39fcbb0-f166-4204-8b21-1fe44c65129f-kube-api-access-ht6qz\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:04 crc kubenswrapper[4822]: I1010 09:00:04.601788 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" event={"ID":"b39fcbb0-f166-4204-8b21-1fe44c65129f","Type":"ContainerDied","Data":"ec3a234434b97d0bb13252548228462bbddc213a9e7b6600045f77eb51a7a437"} Oct 10 09:00:04 crc kubenswrapper[4822]: I1010 09:00:04.602264 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3a234434b97d0bb13252548228462bbddc213a9e7b6600045f77eb51a7a437" Oct 10 09:00:04 crc kubenswrapper[4822]: I1010 09:00:04.601910 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-vdkq5" Oct 10 09:00:04 crc kubenswrapper[4822]: I1010 09:00:04.817656 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h"] Oct 10 09:00:04 crc kubenswrapper[4822]: I1010 09:00:04.832670 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-6hh9h"] Oct 10 09:00:05 crc kubenswrapper[4822]: I1010 09:00:05.677069 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d817ef48-a80c-4a94-9ca9-55cf1cfd2f97" path="/var/lib/kubelet/pods/d817ef48-a80c-4a94-9ca9-55cf1cfd2f97/volumes" Oct 10 09:00:12 crc kubenswrapper[4822]: I1010 09:00:12.650533 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:00:12 crc kubenswrapper[4822]: E1010 09:00:12.651562 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:23 crc kubenswrapper[4822]: I1010 09:00:23.661197 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:00:23 crc kubenswrapper[4822]: E1010 09:00:23.662331 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:37 crc kubenswrapper[4822]: I1010 09:00:37.650729 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:00:37 crc kubenswrapper[4822]: E1010 09:00:37.651745 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:40 crc kubenswrapper[4822]: I1010 09:00:40.279494 4822 scope.go:117] "RemoveContainer" containerID="85d5514f4ece88ea1fe1334b2838c59e14c4c26d3accbde0d6c07158d0b271e0" Oct 10 09:00:51 crc kubenswrapper[4822]: I1010 09:00:51.651079 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:00:51 crc kubenswrapper[4822]: E1010 09:00:51.651932 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.693563 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:00:54 crc kubenswrapper[4822]: E1010 09:00:54.694521 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39fcbb0-f166-4204-8b21-1fe44c65129f" 
containerName="collect-profiles" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.694532 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39fcbb0-f166-4204-8b21-1fe44c65129f" containerName="collect-profiles" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.694744 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39fcbb0-f166-4204-8b21-1fe44c65129f" containerName="collect-profiles" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.696252 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.723321 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.800185 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.800258 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.800547 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9plz\" (UniqueName: \"kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " 
pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.902400 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.902441 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.902502 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9plz\" (UniqueName: \"kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.902901 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.903257 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " 
pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:54 crc kubenswrapper[4822]: I1010 09:00:54.927630 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9plz\" (UniqueName: \"kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz\") pod \"certified-operators-xjpsf\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:55 crc kubenswrapper[4822]: I1010 09:00:55.025323 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:00:55 crc kubenswrapper[4822]: I1010 09:00:55.569200 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:00:56 crc kubenswrapper[4822]: I1010 09:00:56.308833 4822 generic.go:334] "Generic (PLEG): container finished" podID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerID="91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56" exitCode=0 Oct 10 09:00:56 crc kubenswrapper[4822]: I1010 09:00:56.308942 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerDied","Data":"91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56"} Oct 10 09:00:56 crc kubenswrapper[4822]: I1010 09:00:56.309218 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerStarted","Data":"34af22f71700f1fb1f5d5a2af9611ce30bec0719e2eaf7f02117c21b6818c003"} Oct 10 09:00:58 crc kubenswrapper[4822]: I1010 09:00:58.335326 4822 generic.go:334] "Generic (PLEG): container finished" podID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerID="ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf" exitCode=0 Oct 10 09:00:58 crc 
kubenswrapper[4822]: I1010 09:00:58.335902 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerDied","Data":"ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf"} Oct 10 09:00:59 crc kubenswrapper[4822]: I1010 09:00:59.353976 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerStarted","Data":"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146"} Oct 10 09:00:59 crc kubenswrapper[4822]: I1010 09:00:59.368347 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjpsf" podStartSLOduration=2.685920183 podStartE2EDuration="5.368326615s" podCreationTimestamp="2025-10-10 09:00:54 +0000 UTC" firstStartedPulling="2025-10-10 09:00:56.318871709 +0000 UTC m=+9403.414029915" lastFinishedPulling="2025-10-10 09:00:59.001278151 +0000 UTC m=+9406.096436347" observedRunningTime="2025-10-10 09:00:59.367362267 +0000 UTC m=+9406.462520483" watchObservedRunningTime="2025-10-10 09:00:59.368326615 +0000 UTC m=+9406.463484811" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.167726 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29334781-c8lnk"] Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.170193 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.177037 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334781-c8lnk"] Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.332536 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.332580 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqr58\" (UniqueName: \"kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.332617 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.333152 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.435023 4822 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.435831 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.435875 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqr58\" (UniqueName: \"kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:00 crc kubenswrapper[4822]: I1010 09:01:00.435933 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.071783 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.072221 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.072308 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.074267 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqr58\" (UniqueName: \"kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58\") pod \"keystone-cron-29334781-c8lnk\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.093887 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:01 crc kubenswrapper[4822]: I1010 09:01:01.574241 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334781-c8lnk"] Oct 10 09:01:01 crc kubenswrapper[4822]: W1010 09:01:01.578499 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c77dae_c2ca_4c36_8d42_11478062ad05.slice/crio-c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798 WatchSource:0}: Error finding container c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798: Status 404 returned error can't find the container with id c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798 Oct 10 09:01:02 crc kubenswrapper[4822]: I1010 09:01:02.417120 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-c8lnk" event={"ID":"33c77dae-c2ca-4c36-8d42-11478062ad05","Type":"ContainerStarted","Data":"57e578203cc2628989561f2d5c4b4185c199412cd6930ea60600b7768910928a"} Oct 10 09:01:02 crc kubenswrapper[4822]: I1010 09:01:02.417451 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-c8lnk" event={"ID":"33c77dae-c2ca-4c36-8d42-11478062ad05","Type":"ContainerStarted","Data":"c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798"} Oct 10 09:01:02 crc kubenswrapper[4822]: I1010 09:01:02.456990 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29334781-c8lnk" podStartSLOduration=2.456968483 podStartE2EDuration="2.456968483s" podCreationTimestamp="2025-10-10 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:01:02.442763586 +0000 UTC m=+9409.537921782" watchObservedRunningTime="2025-10-10 09:01:02.456968483 +0000 UTC m=+9409.552126689" Oct 10 09:01:05 crc 
kubenswrapper[4822]: I1010 09:01:05.026155 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.032612 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.107438 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.460126 4822 generic.go:334] "Generic (PLEG): container finished" podID="33c77dae-c2ca-4c36-8d42-11478062ad05" containerID="57e578203cc2628989561f2d5c4b4185c199412cd6930ea60600b7768910928a" exitCode=0 Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.461276 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-c8lnk" event={"ID":"33c77dae-c2ca-4c36-8d42-11478062ad05","Type":"ContainerDied","Data":"57e578203cc2628989561f2d5c4b4185c199412cd6930ea60600b7768910928a"} Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.513070 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:05 crc kubenswrapper[4822]: I1010 09:01:05.577749 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:01:06 crc kubenswrapper[4822]: I1010 09:01:06.650408 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:01:06 crc kubenswrapper[4822]: E1010 09:01:06.650990 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:01:06 crc kubenswrapper[4822]: I1010 09:01:06.958349 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.111236 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle\") pod \"33c77dae-c2ca-4c36-8d42-11478062ad05\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.111522 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqr58\" (UniqueName: \"kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58\") pod \"33c77dae-c2ca-4c36-8d42-11478062ad05\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.111641 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys\") pod \"33c77dae-c2ca-4c36-8d42-11478062ad05\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.111697 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data\") pod \"33c77dae-c2ca-4c36-8d42-11478062ad05\" (UID: \"33c77dae-c2ca-4c36-8d42-11478062ad05\") " Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.117305 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "33c77dae-c2ca-4c36-8d42-11478062ad05" (UID: "33c77dae-c2ca-4c36-8d42-11478062ad05"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.121445 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58" (OuterVolumeSpecName: "kube-api-access-gqr58") pod "33c77dae-c2ca-4c36-8d42-11478062ad05" (UID: "33c77dae-c2ca-4c36-8d42-11478062ad05"). InnerVolumeSpecName "kube-api-access-gqr58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.149414 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33c77dae-c2ca-4c36-8d42-11478062ad05" (UID: "33c77dae-c2ca-4c36-8d42-11478062ad05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.212051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data" (OuterVolumeSpecName: "config-data") pod "33c77dae-c2ca-4c36-8d42-11478062ad05" (UID: "33c77dae-c2ca-4c36-8d42-11478062ad05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.214911 4822 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.214959 4822 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.214972 4822 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c77dae-c2ca-4c36-8d42-11478062ad05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.214987 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqr58\" (UniqueName: \"kubernetes.io/projected/33c77dae-c2ca-4c36-8d42-11478062ad05-kube-api-access-gqr58\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.519418 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-c8lnk" event={"ID":"33c77dae-c2ca-4c36-8d42-11478062ad05","Type":"ContainerDied","Data":"c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798"} Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.519846 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0971c0e7b620708c54d1cdaad94074ed69983a31b7cefc533d7ced968346798" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.519477 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334781-c8lnk" Oct 10 09:01:07 crc kubenswrapper[4822]: I1010 09:01:07.520154 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xjpsf" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="registry-server" containerID="cri-o://1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146" gracePeriod=2 Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.119848 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.238751 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content\") pod \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.239042 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities\") pod \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.239131 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9plz\" (UniqueName: \"kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz\") pod \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\" (UID: \"e8f4ee81-45a2-4d63-8f74-6d4df800ef17\") " Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.239912 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities" (OuterVolumeSpecName: "utilities") pod "e8f4ee81-45a2-4d63-8f74-6d4df800ef17" 
(UID: "e8f4ee81-45a2-4d63-8f74-6d4df800ef17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.243919 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz" (OuterVolumeSpecName: "kube-api-access-m9plz") pod "e8f4ee81-45a2-4d63-8f74-6d4df800ef17" (UID: "e8f4ee81-45a2-4d63-8f74-6d4df800ef17"). InnerVolumeSpecName "kube-api-access-m9plz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.281276 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8f4ee81-45a2-4d63-8f74-6d4df800ef17" (UID: "e8f4ee81-45a2-4d63-8f74-6d4df800ef17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.341662 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.341701 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9plz\" (UniqueName: \"kubernetes.io/projected/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-kube-api-access-m9plz\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.341715 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f4ee81-45a2-4d63-8f74-6d4df800ef17-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.535419 4822 generic.go:334] "Generic (PLEG): container finished" 
podID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerID="1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146" exitCode=0 Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.535489 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjpsf" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.535521 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerDied","Data":"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146"} Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.535867 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjpsf" event={"ID":"e8f4ee81-45a2-4d63-8f74-6d4df800ef17","Type":"ContainerDied","Data":"34af22f71700f1fb1f5d5a2af9611ce30bec0719e2eaf7f02117c21b6818c003"} Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.535893 4822 scope.go:117] "RemoveContainer" containerID="1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.574476 4822 scope.go:117] "RemoveContainer" containerID="ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.584528 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.601629 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xjpsf"] Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.614088 4822 scope.go:117] "RemoveContainer" containerID="91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.664291 4822 scope.go:117] "RemoveContainer" 
containerID="1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146" Oct 10 09:01:08 crc kubenswrapper[4822]: E1010 09:01:08.665465 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146\": container with ID starting with 1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146 not found: ID does not exist" containerID="1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.665519 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146"} err="failed to get container status \"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146\": rpc error: code = NotFound desc = could not find container \"1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146\": container with ID starting with 1bbcf17ba51b5864ad26d8e82deaf0bf93f1351ab8acbf66f7d107e0f2cce146 not found: ID does not exist" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.665553 4822 scope.go:117] "RemoveContainer" containerID="ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf" Oct 10 09:01:08 crc kubenswrapper[4822]: E1010 09:01:08.666206 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf\": container with ID starting with ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf not found: ID does not exist" containerID="ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.666298 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf"} err="failed to get container status \"ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf\": rpc error: code = NotFound desc = could not find container \"ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf\": container with ID starting with ea13909e78cbd1c9597b8cacca382f1de018b8cdba15919f2ab358e7bbe450bf not found: ID does not exist" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.666359 4822 scope.go:117] "RemoveContainer" containerID="91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56" Oct 10 09:01:08 crc kubenswrapper[4822]: E1010 09:01:08.666862 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56\": container with ID starting with 91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56 not found: ID does not exist" containerID="91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56" Oct 10 09:01:08 crc kubenswrapper[4822]: I1010 09:01:08.666893 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56"} err="failed to get container status \"91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56\": rpc error: code = NotFound desc = could not find container \"91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56\": container with ID starting with 91700cc8f6603fd0d68ea7b82798b66309cfea702523017439ac804d0315db56 not found: ID does not exist" Oct 10 09:01:09 crc kubenswrapper[4822]: I1010 09:01:09.672367 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" path="/var/lib/kubelet/pods/e8f4ee81-45a2-4d63-8f74-6d4df800ef17/volumes" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 
09:01:11.203109 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:11 crc kubenswrapper[4822]: E1010 09:01:11.204492 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="extract-utilities" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.204526 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="extract-utilities" Oct 10 09:01:11 crc kubenswrapper[4822]: E1010 09:01:11.204585 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="extract-content" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.204605 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="extract-content" Oct 10 09:01:11 crc kubenswrapper[4822]: E1010 09:01:11.204657 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c77dae-c2ca-4c36-8d42-11478062ad05" containerName="keystone-cron" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.204675 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c77dae-c2ca-4c36-8d42-11478062ad05" containerName="keystone-cron" Oct 10 09:01:11 crc kubenswrapper[4822]: E1010 09:01:11.204718 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="registry-server" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.204737 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="registry-server" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.205343 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f4ee81-45a2-4d63-8f74-6d4df800ef17" containerName="registry-server" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.205394 4822 
memory_manager.go:354] "RemoveStaleState removing state" podUID="33c77dae-c2ca-4c36-8d42-11478062ad05" containerName="keystone-cron" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.210942 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.221957 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.318252 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrk5\" (UniqueName: \"kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.318314 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.318339 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.420657 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrk5\" (UniqueName: 
\"kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.420722 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.420743 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.421265 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.421300 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.788696 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrk5\" (UniqueName: 
\"kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5\") pod \"community-operators-b8l97\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:11 crc kubenswrapper[4822]: I1010 09:01:11.852796 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:12 crc kubenswrapper[4822]: I1010 09:01:12.334330 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:12 crc kubenswrapper[4822]: I1010 09:01:12.580680 4822 generic.go:334] "Generic (PLEG): container finished" podID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerID="83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7" exitCode=0 Oct 10 09:01:12 crc kubenswrapper[4822]: I1010 09:01:12.580906 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8l97" event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerDied","Data":"83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7"} Oct 10 09:01:12 crc kubenswrapper[4822]: I1010 09:01:12.581044 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8l97" event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerStarted","Data":"7bbc357f46b75fa100b232faa0ef8a4efc9d799af150132b0d348559ac557881"} Oct 10 09:01:15 crc kubenswrapper[4822]: I1010 09:01:15.636837 4822 generic.go:334] "Generic (PLEG): container finished" podID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerID="bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59" exitCode=0 Oct 10 09:01:15 crc kubenswrapper[4822]: I1010 09:01:15.636894 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8l97" 
event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerDied","Data":"bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59"} Oct 10 09:01:17 crc kubenswrapper[4822]: I1010 09:01:17.666478 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8l97" event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerStarted","Data":"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff"} Oct 10 09:01:17 crc kubenswrapper[4822]: I1010 09:01:17.693830 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b8l97" podStartSLOduration=2.247654421 podStartE2EDuration="6.693788575s" podCreationTimestamp="2025-10-10 09:01:11 +0000 UTC" firstStartedPulling="2025-10-10 09:01:12.583838537 +0000 UTC m=+9419.678996743" lastFinishedPulling="2025-10-10 09:01:17.029972661 +0000 UTC m=+9424.125130897" observedRunningTime="2025-10-10 09:01:17.686955719 +0000 UTC m=+9424.782113955" watchObservedRunningTime="2025-10-10 09:01:17.693788575 +0000 UTC m=+9424.788946771" Oct 10 09:01:20 crc kubenswrapper[4822]: I1010 09:01:20.652067 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:01:20 crc kubenswrapper[4822]: E1010 09:01:20.653345 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:01:21 crc kubenswrapper[4822]: I1010 09:01:21.853838 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:21 crc 
kubenswrapper[4822]: I1010 09:01:21.853933 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:21 crc kubenswrapper[4822]: I1010 09:01:21.955384 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:22 crc kubenswrapper[4822]: I1010 09:01:22.818573 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:23 crc kubenswrapper[4822]: I1010 09:01:23.771245 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:24 crc kubenswrapper[4822]: I1010 09:01:24.749691 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b8l97" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="registry-server" containerID="cri-o://ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff" gracePeriod=2 Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.324777 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.366136 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrk5\" (UniqueName: \"kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5\") pod \"69954b9c-bb73-42e4-ac80-bd2b381aad30\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.366261 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities\") pod \"69954b9c-bb73-42e4-ac80-bd2b381aad30\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.366429 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content\") pod \"69954b9c-bb73-42e4-ac80-bd2b381aad30\" (UID: \"69954b9c-bb73-42e4-ac80-bd2b381aad30\") " Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.367235 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities" (OuterVolumeSpecName: "utilities") pod "69954b9c-bb73-42e4-ac80-bd2b381aad30" (UID: "69954b9c-bb73-42e4-ac80-bd2b381aad30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.375066 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5" (OuterVolumeSpecName: "kube-api-access-2vrk5") pod "69954b9c-bb73-42e4-ac80-bd2b381aad30" (UID: "69954b9c-bb73-42e4-ac80-bd2b381aad30"). InnerVolumeSpecName "kube-api-access-2vrk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.424072 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69954b9c-bb73-42e4-ac80-bd2b381aad30" (UID: "69954b9c-bb73-42e4-ac80-bd2b381aad30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.468505 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.468543 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrk5\" (UniqueName: \"kubernetes.io/projected/69954b9c-bb73-42e4-ac80-bd2b381aad30-kube-api-access-2vrk5\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.468554 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69954b9c-bb73-42e4-ac80-bd2b381aad30-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.767436 4822 generic.go:334] "Generic (PLEG): container finished" podID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerID="ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff" exitCode=0 Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.767503 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8l97" event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerDied","Data":"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff"} Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.767548 4822 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-b8l97" event={"ID":"69954b9c-bb73-42e4-ac80-bd2b381aad30","Type":"ContainerDied","Data":"7bbc357f46b75fa100b232faa0ef8a4efc9d799af150132b0d348559ac557881"} Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.767575 4822 scope.go:117] "RemoveContainer" containerID="ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.767574 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8l97" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.803229 4822 scope.go:117] "RemoveContainer" containerID="bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.809965 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.827181 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b8l97"] Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.851093 4822 scope.go:117] "RemoveContainer" containerID="83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.945715 4822 scope.go:117] "RemoveContainer" containerID="ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff" Oct 10 09:01:25 crc kubenswrapper[4822]: E1010 09:01:25.946432 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff\": container with ID starting with ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff not found: ID does not exist" containerID="ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 
09:01:25.946495 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff"} err="failed to get container status \"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff\": rpc error: code = NotFound desc = could not find container \"ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff\": container with ID starting with ab4bbfe6b2a5f20b5c9fecee07829c9d59c61c4562dd77756792cd631651e4ff not found: ID does not exist" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.946534 4822 scope.go:117] "RemoveContainer" containerID="bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59" Oct 10 09:01:25 crc kubenswrapper[4822]: E1010 09:01:25.947213 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59\": container with ID starting with bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59 not found: ID does not exist" containerID="bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.947242 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59"} err="failed to get container status \"bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59\": rpc error: code = NotFound desc = could not find container \"bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59\": container with ID starting with bcd20017c49d7158c171ab9305aa5c47a1536c5521adc47f5a42f015a123ba59 not found: ID does not exist" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.947261 4822 scope.go:117] "RemoveContainer" containerID="83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7" Oct 10 09:01:25 crc 
kubenswrapper[4822]: E1010 09:01:25.947894 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7\": container with ID starting with 83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7 not found: ID does not exist" containerID="83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7" Oct 10 09:01:25 crc kubenswrapper[4822]: I1010 09:01:25.947968 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7"} err="failed to get container status \"83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7\": rpc error: code = NotFound desc = could not find container \"83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7\": container with ID starting with 83f4cff3d712a433a57172011cec6af336e545bb9bab3865da40e8dfe45931c7 not found: ID does not exist" Oct 10 09:01:27 crc kubenswrapper[4822]: I1010 09:01:27.673631 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" path="/var/lib/kubelet/pods/69954b9c-bb73-42e4-ac80-bd2b381aad30/volumes" Oct 10 09:01:34 crc kubenswrapper[4822]: I1010 09:01:34.650782 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:01:34 crc kubenswrapper[4822]: E1010 09:01:34.651595 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:01:39 crc 
kubenswrapper[4822]: I1010 09:01:39.917106 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:01:39 crc kubenswrapper[4822]: E1010 09:01:39.918483 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="extract-content" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.918508 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="extract-content" Oct 10 09:01:39 crc kubenswrapper[4822]: E1010 09:01:39.918531 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="extract-utilities" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.918544 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="extract-utilities" Oct 10 09:01:39 crc kubenswrapper[4822]: E1010 09:01:39.918612 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="registry-server" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.918623 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="registry-server" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.919054 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="69954b9c-bb73-42e4-ac80-bd2b381aad30" containerName="registry-server" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.921929 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.926708 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.926944 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7gp\" (UniqueName: \"kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.927026 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:39 crc kubenswrapper[4822]: I1010 09:01:39.932739 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.028092 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7gp\" (UniqueName: \"kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.028183 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.028325 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.028928 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.028938 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.050521 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7gp\" (UniqueName: \"kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp\") pod \"redhat-operators-9sr24\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.258344 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.709061 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.972346 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerStarted","Data":"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220"} Oct 10 09:01:40 crc kubenswrapper[4822]: I1010 09:01:40.972388 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerStarted","Data":"90e72e05950d3ac93b8c4eccf9d5ff8759afa92b941c02cf7f410d21acd5f688"} Oct 10 09:01:41 crc kubenswrapper[4822]: I1010 09:01:41.991206 4822 generic.go:334] "Generic (PLEG): container finished" podID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerID="694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220" exitCode=0 Oct 10 09:01:41 crc kubenswrapper[4822]: I1010 09:01:41.992699 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerDied","Data":"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220"} Oct 10 09:01:43 crc kubenswrapper[4822]: I1010 09:01:43.007576 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerStarted","Data":"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15"} Oct 10 09:01:45 crc kubenswrapper[4822]: I1010 09:01:45.041099 4822 generic.go:334] "Generic (PLEG): container finished" podID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" 
containerID="0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15" exitCode=0 Oct 10 09:01:45 crc kubenswrapper[4822]: I1010 09:01:45.041171 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerDied","Data":"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15"} Oct 10 09:01:46 crc kubenswrapper[4822]: I1010 09:01:46.056326 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerStarted","Data":"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373"} Oct 10 09:01:46 crc kubenswrapper[4822]: I1010 09:01:46.088827 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9sr24" podStartSLOduration=2.401081384 podStartE2EDuration="7.088794044s" podCreationTimestamp="2025-10-10 09:01:39 +0000 UTC" firstStartedPulling="2025-10-10 09:01:40.974419143 +0000 UTC m=+9448.069577349" lastFinishedPulling="2025-10-10 09:01:45.662131813 +0000 UTC m=+9452.757290009" observedRunningTime="2025-10-10 09:01:46.078823059 +0000 UTC m=+9453.173981305" watchObservedRunningTime="2025-10-10 09:01:46.088794044 +0000 UTC m=+9453.183952250" Oct 10 09:01:47 crc kubenswrapper[4822]: I1010 09:01:47.651415 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:01:47 crc kubenswrapper[4822]: E1010 09:01:47.652679 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:01:50 crc kubenswrapper[4822]: I1010 09:01:50.259136 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:50 crc kubenswrapper[4822]: I1010 09:01:50.261134 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:01:51 crc kubenswrapper[4822]: I1010 09:01:51.344623 4822 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sr24" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="registry-server" probeResult="failure" output=< Oct 10 09:01:51 crc kubenswrapper[4822]: timeout: failed to connect service ":50051" within 1s Oct 10 09:01:51 crc kubenswrapper[4822]: > Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.057901 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.141520 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.337015 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.340124 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.355662 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.519780 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.520224 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxkpp\" (UniqueName: \"kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.520301 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.622287 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.622365 4822 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mxkpp\" (UniqueName: \"kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.622415 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.622854 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.623083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.642229 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxkpp\" (UniqueName: \"kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp\") pod \"redhat-marketplace-dmfnh\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:01 crc kubenswrapper[4822]: I1010 09:02:01.671716 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:02 crc kubenswrapper[4822]: I1010 09:02:02.174630 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:02 crc kubenswrapper[4822]: W1010 09:02:02.177472 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a9f82d_3556_43e4_91fa_a5c724874b27.slice/crio-12129c0d14155bccd18aec01a90c7a0ff95a3a27d06d3910638bc45f58fd83e2 WatchSource:0}: Error finding container 12129c0d14155bccd18aec01a90c7a0ff95a3a27d06d3910638bc45f58fd83e2: Status 404 returned error can't find the container with id 12129c0d14155bccd18aec01a90c7a0ff95a3a27d06d3910638bc45f58fd83e2 Oct 10 09:02:02 crc kubenswrapper[4822]: I1010 09:02:02.272992 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerStarted","Data":"12129c0d14155bccd18aec01a90c7a0ff95a3a27d06d3910638bc45f58fd83e2"} Oct 10 09:02:02 crc kubenswrapper[4822]: I1010 09:02:02.649917 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:02:03 crc kubenswrapper[4822]: I1010 09:02:03.287744 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73"} Oct 10 09:02:03 crc kubenswrapper[4822]: I1010 09:02:03.290040 4822 generic.go:334] "Generic (PLEG): container finished" podID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerID="782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01" exitCode=0 Oct 10 09:02:03 crc kubenswrapper[4822]: I1010 09:02:03.290084 4822 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerDied","Data":"782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01"} Oct 10 09:02:03 crc kubenswrapper[4822]: I1010 09:02:03.507692 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:02:03 crc kubenswrapper[4822]: I1010 09:02:03.508194 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9sr24" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="registry-server" containerID="cri-o://d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373" gracePeriod=2 Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.165836 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.288049 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content\") pod \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.288601 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7gp\" (UniqueName: \"kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp\") pod \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\" (UID: \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.290922 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities\") pod \"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\" (UID: 
\"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5\") " Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.291179 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities" (OuterVolumeSpecName: "utilities") pod "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" (UID: "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.292076 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.300026 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp" (OuterVolumeSpecName: "kube-api-access-4n7gp") pod "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" (UID: "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5"). InnerVolumeSpecName "kube-api-access-4n7gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.313566 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerStarted","Data":"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348"} Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.328026 4822 generic.go:334] "Generic (PLEG): container finished" podID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerID="d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373" exitCode=0 Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.328099 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9sr24" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.328119 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerDied","Data":"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373"} Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.328161 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sr24" event={"ID":"06d73f0b-1b05-401a-bdf0-8d4344ef0cc5","Type":"ContainerDied","Data":"90e72e05950d3ac93b8c4eccf9d5ff8759afa92b941c02cf7f410d21acd5f688"} Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.328180 4822 scope.go:117] "RemoveContainer" containerID="d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.367537 4822 scope.go:117] "RemoveContainer" containerID="0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.389382 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" (UID: "06d73f0b-1b05-401a-bdf0-8d4344ef0cc5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.393601 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.393629 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7gp\" (UniqueName: \"kubernetes.io/projected/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5-kube-api-access-4n7gp\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.401637 4822 scope.go:117] "RemoveContainer" containerID="694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.445746 4822 scope.go:117] "RemoveContainer" containerID="d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373" Oct 10 09:02:04 crc kubenswrapper[4822]: E1010 09:02:04.446396 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373\": container with ID starting with d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373 not found: ID does not exist" containerID="d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.446468 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373"} err="failed to get container status \"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373\": rpc error: code = NotFound desc = could not find container \"d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373\": container with ID starting with d7593d66ee1809d93852fedc51a0e37e8af8dda0f69a332a7508b68f05b80373 not 
found: ID does not exist" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.446516 4822 scope.go:117] "RemoveContainer" containerID="0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15" Oct 10 09:02:04 crc kubenswrapper[4822]: E1010 09:02:04.446927 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15\": container with ID starting with 0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15 not found: ID does not exist" containerID="0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.446980 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15"} err="failed to get container status \"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15\": rpc error: code = NotFound desc = could not find container \"0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15\": container with ID starting with 0b0499f1f98037e5dcef1966a2cf81617ea764841c2a03a81b375dc179c95f15 not found: ID does not exist" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.446998 4822 scope.go:117] "RemoveContainer" containerID="694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220" Oct 10 09:02:04 crc kubenswrapper[4822]: E1010 09:02:04.447407 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220\": container with ID starting with 694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220 not found: ID does not exist" containerID="694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.447461 4822 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220"} err="failed to get container status \"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220\": rpc error: code = NotFound desc = could not find container \"694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220\": container with ID starting with 694554d9c0100823678a05e4c5819b8b9628dbcac2790eeeef27ef2c55cd9220 not found: ID does not exist" Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.737699 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:02:04 crc kubenswrapper[4822]: I1010 09:02:04.750575 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9sr24"] Oct 10 09:02:05 crc kubenswrapper[4822]: I1010 09:02:05.343363 4822 generic.go:334] "Generic (PLEG): container finished" podID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerID="a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348" exitCode=0 Oct 10 09:02:05 crc kubenswrapper[4822]: I1010 09:02:05.343466 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerDied","Data":"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348"} Oct 10 09:02:05 crc kubenswrapper[4822]: I1010 09:02:05.667360 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" path="/var/lib/kubelet/pods/06d73f0b-1b05-401a-bdf0-8d4344ef0cc5/volumes" Oct 10 09:02:06 crc kubenswrapper[4822]: I1010 09:02:06.365911 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" 
event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerStarted","Data":"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82"} Oct 10 09:02:06 crc kubenswrapper[4822]: I1010 09:02:06.388368 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmfnh" podStartSLOduration=2.71241462 podStartE2EDuration="5.388338876s" podCreationTimestamp="2025-10-10 09:02:01 +0000 UTC" firstStartedPulling="2025-10-10 09:02:03.291790812 +0000 UTC m=+9470.386949038" lastFinishedPulling="2025-10-10 09:02:05.967715088 +0000 UTC m=+9473.062873294" observedRunningTime="2025-10-10 09:02:06.38392234 +0000 UTC m=+9473.479080556" watchObservedRunningTime="2025-10-10 09:02:06.388338876 +0000 UTC m=+9473.483497102" Oct 10 09:02:11 crc kubenswrapper[4822]: I1010 09:02:11.694480 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:11 crc kubenswrapper[4822]: I1010 09:02:11.695180 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:11 crc kubenswrapper[4822]: I1010 09:02:11.768325 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:12 crc kubenswrapper[4822]: I1010 09:02:12.522629 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:12 crc kubenswrapper[4822]: I1010 09:02:12.594156 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:14 crc kubenswrapper[4822]: I1010 09:02:14.475525 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dmfnh" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="registry-server" 
containerID="cri-o://a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82" gracePeriod=2 Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.095055 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.283184 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities\") pod \"73a9f82d-3556-43e4-91fa-a5c724874b27\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.283493 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxkpp\" (UniqueName: \"kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp\") pod \"73a9f82d-3556-43e4-91fa-a5c724874b27\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.283649 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content\") pod \"73a9f82d-3556-43e4-91fa-a5c724874b27\" (UID: \"73a9f82d-3556-43e4-91fa-a5c724874b27\") " Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.284254 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities" (OuterVolumeSpecName: "utilities") pod "73a9f82d-3556-43e4-91fa-a5c724874b27" (UID: "73a9f82d-3556-43e4-91fa-a5c724874b27"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.293960 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp" (OuterVolumeSpecName: "kube-api-access-mxkpp") pod "73a9f82d-3556-43e4-91fa-a5c724874b27" (UID: "73a9f82d-3556-43e4-91fa-a5c724874b27"). InnerVolumeSpecName "kube-api-access-mxkpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.303109 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a9f82d-3556-43e4-91fa-a5c724874b27" (UID: "73a9f82d-3556-43e4-91fa-a5c724874b27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.385877 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxkpp\" (UniqueName: \"kubernetes.io/projected/73a9f82d-3556-43e4-91fa-a5c724874b27-kube-api-access-mxkpp\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.385912 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.385926 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a9f82d-3556-43e4-91fa-a5c724874b27-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.492778 4822 generic.go:334] "Generic (PLEG): container finished" podID="73a9f82d-3556-43e4-91fa-a5c724874b27" 
containerID="a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82" exitCode=0 Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.492874 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerDied","Data":"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82"} Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.492936 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmfnh" event={"ID":"73a9f82d-3556-43e4-91fa-a5c724874b27","Type":"ContainerDied","Data":"12129c0d14155bccd18aec01a90c7a0ff95a3a27d06d3910638bc45f58fd83e2"} Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.492931 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmfnh" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.492959 4822 scope.go:117] "RemoveContainer" containerID="a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.541301 4822 scope.go:117] "RemoveContainer" containerID="a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348" Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.551680 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.569145 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmfnh"] Oct 10 09:02:15 crc kubenswrapper[4822]: I1010 09:02:15.683783 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" path="/var/lib/kubelet/pods/73a9f82d-3556-43e4-91fa-a5c724874b27/volumes" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.203951 4822 scope.go:117] "RemoveContainer" 
containerID="782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.282173 4822 scope.go:117] "RemoveContainer" containerID="a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82" Oct 10 09:02:16 crc kubenswrapper[4822]: E1010 09:02:16.283710 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82\": container with ID starting with a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82 not found: ID does not exist" containerID="a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.283743 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82"} err="failed to get container status \"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82\": rpc error: code = NotFound desc = could not find container \"a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82\": container with ID starting with a24a4c0ab9b194446bdcebf22c2b138fcbbe4934e5f73b88059f5d08986e6d82 not found: ID does not exist" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.283764 4822 scope.go:117] "RemoveContainer" containerID="a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348" Oct 10 09:02:16 crc kubenswrapper[4822]: E1010 09:02:16.284497 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348\": container with ID starting with a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348 not found: ID does not exist" containerID="a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348" Oct 10 09:02:16 crc 
kubenswrapper[4822]: I1010 09:02:16.284546 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348"} err="failed to get container status \"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348\": rpc error: code = NotFound desc = could not find container \"a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348\": container with ID starting with a2ae8d7336718abf0930f2cf94387c3df753b7453fed1e3c47863428d56a8348 not found: ID does not exist" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.284583 4822 scope.go:117] "RemoveContainer" containerID="782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01" Oct 10 09:02:16 crc kubenswrapper[4822]: E1010 09:02:16.285219 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01\": container with ID starting with 782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01 not found: ID does not exist" containerID="782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01" Oct 10 09:02:16 crc kubenswrapper[4822]: I1010 09:02:16.285253 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01"} err="failed to get container status \"782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01\": rpc error: code = NotFound desc = could not find container \"782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01\": container with ID starting with 782d539448c68b5ea83d4aa8585b2aec5c9b080a65f870e98b889819110a2e01 not found: ID does not exist" Oct 10 09:04:31 crc kubenswrapper[4822]: I1010 09:04:31.336638 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:04:31 crc kubenswrapper[4822]: I1010 09:04:31.337726 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:05:01 crc kubenswrapper[4822]: I1010 09:05:01.336794 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:05:01 crc kubenswrapper[4822]: I1010 09:05:01.337537 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:05:31 crc kubenswrapper[4822]: I1010 09:05:31.336288 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:05:31 crc kubenswrapper[4822]: I1010 09:05:31.336847 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:05:31 crc kubenswrapper[4822]: I1010 09:05:31.336891 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 09:05:31 crc kubenswrapper[4822]: I1010 09:05:31.337396 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:05:31 crc kubenswrapper[4822]: I1010 09:05:31.337446 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73" gracePeriod=600 Oct 10 09:05:32 crc kubenswrapper[4822]: I1010 09:05:32.049706 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73" exitCode=0 Oct 10 09:05:32 crc kubenswrapper[4822]: I1010 09:05:32.049998 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73"} Oct 10 09:05:32 crc kubenswrapper[4822]: I1010 09:05:32.050269 4822 scope.go:117] "RemoveContainer" containerID="8dc18b24e0168707a505beba2e59fda5d449aacc246db42d6a0dd40e3f312f02" Oct 10 09:05:33 crc kubenswrapper[4822]: I1010 09:05:33.066289 4822 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"} Oct 10 09:08:01 crc kubenswrapper[4822]: I1010 09:08:01.337306 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:08:01 crc kubenswrapper[4822]: I1010 09:08:01.337956 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:08:18 crc kubenswrapper[4822]: I1010 09:08:18.270371 4822 generic.go:334] "Generic (PLEG): container finished" podID="1992f703-ffd6-47f1-ad78-44790cb9e0be" containerID="6a8933f0e733a9bd8396375bb6d4e5336f8abb66b65e535d0d06a114524c9dd1" exitCode=0 Oct 10 09:08:18 crc kubenswrapper[4822]: I1010 09:08:18.270557 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" event={"ID":"1992f703-ffd6-47f1-ad78-44790cb9e0be","Type":"ContainerDied","Data":"6a8933f0e733a9bd8396375bb6d4e5336f8abb66b65e535d0d06a114524c9dd1"} Oct 10 09:08:19 crc kubenswrapper[4822]: I1010 09:08:19.833835 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.000693 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001125 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001182 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001210 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001266 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 
09:08:20.001402 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001454 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2qll\" (UniqueName: \"kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001511 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001585 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001612 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.001638 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.015312 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.015575 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph" (OuterVolumeSpecName: "ceph") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.035476 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll" (OuterVolumeSpecName: "kube-api-access-d2qll") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "kube-api-access-d2qll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.051583 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.056367 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.063781 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.064675 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory" (OuterVolumeSpecName: "inventory") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.105398 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106911 4822 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106937 4822 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106948 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106962 4822 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-ceph\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106976 4822 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.106988 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2qll\" (UniqueName: \"kubernetes.io/projected/1992f703-ffd6-47f1-ad78-44790cb9e0be-kube-api-access-d2qll\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: 
I1010 09:08:20.308913 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" event={"ID":"1992f703-ffd6-47f1-ad78-44790cb9e0be","Type":"ContainerDied","Data":"9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8"} Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.308953 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9171394c6d85bdc71cc1b63b6b42613609180d29a59f64f25d28ef45e1200fd8" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.308983 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.771378 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.800629 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.813114 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.823275 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.824431 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") pod \"1992f703-ffd6-47f1-ad78-44790cb9e0be\" (UID: \"1992f703-ffd6-47f1-ad78-44790cb9e0be\") " Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.825235 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.825261 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc 
kubenswrapper[4822]: I1010 09:08:20.825276 4822 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:20 crc kubenswrapper[4822]: W1010 09:08:20.826164 4822 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1992f703-ffd6-47f1-ad78-44790cb9e0be/volumes/kubernetes.io~secret/nova-cell1-compute-config-1 Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.826186 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1992f703-ffd6-47f1-ad78-44790cb9e0be" (UID: "1992f703-ffd6-47f1-ad78-44790cb9e0be"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:08:20 crc kubenswrapper[4822]: I1010 09:08:20.927644 4822 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1992f703-ffd6-47f1-ad78-44790cb9e0be-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:08:31 crc kubenswrapper[4822]: I1010 09:08:31.337302 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:08:31 crc kubenswrapper[4822]: I1010 09:08:31.338179 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 10 09:08:55 crc kubenswrapper[4822]: E1010 09:08:55.886592 4822 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.180:41676->38.102.83.180:44473: write tcp 38.102.83.180:41676->38.102.83.180:44473: write: broken pipe Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.337601 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.338271 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.338353 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.339312 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.339391 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" 
containerID="cri-o://b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" gracePeriod=600 Oct 10 09:09:01 crc kubenswrapper[4822]: E1010 09:09:01.470890 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.785029 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" exitCode=0 Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.785074 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"} Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.785123 4822 scope.go:117] "RemoveContainer" containerID="a8ade8825345f93cd8afa0de92e7145445f537ce727be1eb0e6c2b1f670b4d73" Oct 10 09:09:01 crc kubenswrapper[4822]: I1010 09:09:01.786644 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:09:01 crc kubenswrapper[4822]: E1010 09:09:01.787537 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" 
podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:09:13 crc kubenswrapper[4822]: I1010 09:09:13.658616 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:09:13 crc kubenswrapper[4822]: E1010 09:09:13.659562 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:09:27 crc kubenswrapper[4822]: I1010 09:09:27.650056 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:09:27 crc kubenswrapper[4822]: E1010 09:09:27.650678 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:09:39 crc kubenswrapper[4822]: I1010 09:09:39.657716 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:09:39 crc kubenswrapper[4822]: E1010 09:09:39.659742 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:09:50 crc kubenswrapper[4822]: I1010 09:09:50.650750 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:09:50 crc kubenswrapper[4822]: E1010 09:09:50.651671 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:10:03 crc kubenswrapper[4822]: I1010 09:10:03.668927 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:10:03 crc kubenswrapper[4822]: E1010 09:10:03.669782 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:10:15 crc kubenswrapper[4822]: I1010 09:10:15.651288 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:10:15 crc kubenswrapper[4822]: E1010 09:10:15.652288 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:10:25 crc kubenswrapper[4822]: E1010 09:10:25.114062 4822 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.180:59692->38.102.83.180:44473: write tcp 38.102.83.180:59692->38.102.83.180:44473: write: broken pipe Oct 10 09:10:29 crc kubenswrapper[4822]: I1010 09:10:29.652018 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:10:29 crc kubenswrapper[4822]: E1010 09:10:29.653378 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:10:42 crc kubenswrapper[4822]: I1010 09:10:42.006955 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 09:10:42 crc kubenswrapper[4822]: I1010 09:10:42.007848 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="6e167c28-a75e-4df7-b218-c5b29939fa82" containerName="adoption" containerID="cri-o://7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1" gracePeriod=30 Oct 10 09:10:43 crc kubenswrapper[4822]: I1010 09:10:43.662671 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:10:43 crc kubenswrapper[4822]: E1010 09:10:43.663229 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:10:56 crc kubenswrapper[4822]: I1010 09:10:56.651114 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:10:56 crc kubenswrapper[4822]: E1010 09:10:56.652169 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.995966 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997499 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="extract-utilities"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997519 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="extract-utilities"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997536 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="registry-server"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997544 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="registry-server"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997560 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="extract-utilities"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997570 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="extract-utilities"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997603 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1992f703-ffd6-47f1-ad78-44790cb9e0be" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997613 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="1992f703-ffd6-47f1-ad78-44790cb9e0be" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997627 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="extract-content"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997635 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="extract-content"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997650 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="registry-server"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997657 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="registry-server"
Oct 10 09:11:08 crc kubenswrapper[4822]: E1010 09:11:08.997681 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="extract-content"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997690 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="extract-content"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997963 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="1992f703-ffd6-47f1-ad78-44790cb9e0be" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.997993 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d73f0b-1b05-401a-bdf0-8d4344ef0cc5" containerName="registry-server"
Oct 10 09:11:08 crc kubenswrapper[4822]: I1010 09:11:08.998009 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a9f82d-3556-43e4-91fa-a5c724874b27" containerName="registry-server"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.001111 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.037060 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.088992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.089200 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.089319 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2p5g\" (UniqueName: \"kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.192046 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.192305 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.192502 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2p5g\" (UniqueName: \"kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.192886 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.192892 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.214720 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2p5g\" (UniqueName: \"kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g\") pod \"certified-operators-p4gbl\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") " pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.350700 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:09 crc kubenswrapper[4822]: I1010 09:11:09.961795 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:11 crc kubenswrapper[4822]: I1010 09:11:11.347468 4822 generic.go:334] "Generic (PLEG): container finished" podID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerID="d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca" exitCode=0
Oct 10 09:11:11 crc kubenswrapper[4822]: I1010 09:11:11.347574 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerDied","Data":"d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca"}
Oct 10 09:11:11 crc kubenswrapper[4822]: I1010 09:11:11.347981 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerStarted","Data":"b73d2bed4bb08f518bd11393da2280269e7ceb75fb701e6519170d21307c9303"}
Oct 10 09:11:11 crc kubenswrapper[4822]: I1010 09:11:11.350637 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 10 09:11:11 crc kubenswrapper[4822]: I1010 09:11:11.650960 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:11:11 crc kubenswrapper[4822]: E1010 09:11:11.652334 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:11:12 crc kubenswrapper[4822]: E1010 09:11:12.370597 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e167c28_a75e_4df7_b218_c5b29939fa82.slice/crio-7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1.scope\": RecentStats: unable to find data in memory cache]"
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.414500 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerStarted","Data":"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"}
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.417634 4822 generic.go:334] "Generic (PLEG): container finished" podID="6e167c28-a75e-4df7-b218-c5b29939fa82" containerID="7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1" exitCode=137
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.417705 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"6e167c28-a75e-4df7-b218-c5b29939fa82","Type":"ContainerDied","Data":"7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1"}
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.569378 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.673861 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22lc\" (UniqueName: \"kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc\") pod \"6e167c28-a75e-4df7-b218-c5b29939fa82\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") "
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.674432 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") pod \"6e167c28-a75e-4df7-b218-c5b29939fa82\" (UID: \"6e167c28-a75e-4df7-b218-c5b29939fa82\") "
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.687234 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc" (OuterVolumeSpecName: "kube-api-access-w22lc") pod "6e167c28-a75e-4df7-b218-c5b29939fa82" (UID: "6e167c28-a75e-4df7-b218-c5b29939fa82"). InnerVolumeSpecName "kube-api-access-w22lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.701279 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc" (OuterVolumeSpecName: "mariadb-data") pod "6e167c28-a75e-4df7-b218-c5b29939fa82" (UID: "6e167c28-a75e-4df7-b218-c5b29939fa82"). InnerVolumeSpecName "pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.777454 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22lc\" (UniqueName: \"kubernetes.io/projected/6e167c28-a75e-4df7-b218-c5b29939fa82-kube-api-access-w22lc\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.777856 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") on node \"crc\" "
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.805037 4822 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.805228 4822 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc") on node "crc"
Oct 10 09:11:12 crc kubenswrapper[4822]: I1010 09:11:12.879972 4822 reconciler_common.go:293] "Volume detached for volume \"pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de16813-4d4e-4bc6-b231-d2e168be6ccc\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.434352 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.434386 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"6e167c28-a75e-4df7-b218-c5b29939fa82","Type":"ContainerDied","Data":"053213e0bfdd444e4b5fe3374e814b64343cda59cc40280950c1ca0c140c5692"}
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.434482 4822 scope.go:117] "RemoveContainer" containerID="7cc3f7a987ae0eb589309b3b9ea2f78701ee5ede22eaa30cfb8bdc7a2a24c3b1"
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.441621 4822 generic.go:334] "Generic (PLEG): container finished" podID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerID="b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2" exitCode=0
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.441715 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerDied","Data":"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"}
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.515382 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.524373 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"]
Oct 10 09:11:13 crc kubenswrapper[4822]: I1010 09:11:13.670997 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e167c28-a75e-4df7-b218-c5b29939fa82" path="/var/lib/kubelet/pods/6e167c28-a75e-4df7-b218-c5b29939fa82/volumes"
Oct 10 09:11:14 crc kubenswrapper[4822]: I1010 09:11:14.248858 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Oct 10 09:11:14 crc kubenswrapper[4822]: I1010 09:11:14.249411 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" containerName="adoption" containerID="cri-o://4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df" gracePeriod=30
Oct 10 09:11:14 crc kubenswrapper[4822]: I1010 09:11:14.457285 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerStarted","Data":"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"}
Oct 10 09:11:14 crc kubenswrapper[4822]: I1010 09:11:14.493033 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4gbl" podStartSLOduration=3.728194624 podStartE2EDuration="6.493007436s" podCreationTimestamp="2025-10-10 09:11:08 +0000 UTC" firstStartedPulling="2025-10-10 09:11:11.350205647 +0000 UTC m=+10018.445363853" lastFinishedPulling="2025-10-10 09:11:14.115018429 +0000 UTC m=+10021.210176665" observedRunningTime="2025-10-10 09:11:14.47916456 +0000 UTC m=+10021.574322816" watchObservedRunningTime="2025-10-10 09:11:14.493007436 +0000 UTC m=+10021.588165662"
Oct 10 09:11:19 crc kubenswrapper[4822]: I1010 09:11:19.351581 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:19 crc kubenswrapper[4822]: I1010 09:11:19.352317 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:19 crc kubenswrapper[4822]: I1010 09:11:19.403367 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:19 crc kubenswrapper[4822]: I1010 09:11:19.587465 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:19 crc kubenswrapper[4822]: I1010 09:11:19.687600 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:21 crc kubenswrapper[4822]: I1010 09:11:21.543380 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4gbl" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="registry-server" containerID="cri-o://f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58" gracePeriod=2
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.087082 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.092366 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2p5g\" (UniqueName: \"kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g\") pod \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") "
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.092419 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities\") pod \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") "
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.092494 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content\") pod \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\" (UID: \"33432ebf-1ca3-4c3a-99ed-2a258f781ccb\") "
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.093462 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities" (OuterVolumeSpecName: "utilities") pod "33432ebf-1ca3-4c3a-99ed-2a258f781ccb" (UID: "33432ebf-1ca3-4c3a-99ed-2a258f781ccb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.099045 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g" (OuterVolumeSpecName: "kube-api-access-b2p5g") pod "33432ebf-1ca3-4c3a-99ed-2a258f781ccb" (UID: "33432ebf-1ca3-4c3a-99ed-2a258f781ccb"). InnerVolumeSpecName "kube-api-access-b2p5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.167303 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33432ebf-1ca3-4c3a-99ed-2a258f781ccb" (UID: "33432ebf-1ca3-4c3a-99ed-2a258f781ccb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.195828 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2p5g\" (UniqueName: \"kubernetes.io/projected/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-kube-api-access-b2p5g\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.196223 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.196317 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33432ebf-1ca3-4c3a-99ed-2a258f781ccb-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.566421 4822 generic.go:334] "Generic (PLEG): container finished" podID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerID="f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58" exitCode=0
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.566593 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerDied","Data":"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"}
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.566689 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4gbl"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.566714 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4gbl" event={"ID":"33432ebf-1ca3-4c3a-99ed-2a258f781ccb","Type":"ContainerDied","Data":"b73d2bed4bb08f518bd11393da2280269e7ceb75fb701e6519170d21307c9303"}
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.566788 4822 scope.go:117] "RemoveContainer" containerID="f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.631679 4822 scope.go:117] "RemoveContainer" containerID="b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.643519 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.656979 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4gbl"]
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.665191 4822 scope.go:117] "RemoveContainer" containerID="d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.720007 4822 scope.go:117] "RemoveContainer" containerID="f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"
Oct 10 09:11:22 crc kubenswrapper[4822]: E1010 09:11:22.720621 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58\": container with ID starting with f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58 not found: ID does not exist" containerID="f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.720703 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58"} err="failed to get container status \"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58\": rpc error: code = NotFound desc = could not find container \"f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58\": container with ID starting with f9fa538f869c72d960b161b4cda43e63216d41e5082c39cbf0ee7ad4837d5d58 not found: ID does not exist"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.720750 4822 scope.go:117] "RemoveContainer" containerID="b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"
Oct 10 09:11:22 crc kubenswrapper[4822]: E1010 09:11:22.721279 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2\": container with ID starting with b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2 not found: ID does not exist" containerID="b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.721360 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2"} err="failed to get container status \"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2\": rpc error: code = NotFound desc = could not find container \"b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2\": container with ID starting with b5cd31a130f4a2a21021e89ff0c967352ecee781b5ac48d4690102c73f4a5fa2 not found: ID does not exist"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.721385 4822 scope.go:117] "RemoveContainer" containerID="d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca"
Oct 10 09:11:22 crc kubenswrapper[4822]: E1010 09:11:22.721864 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca\": container with ID starting with d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca not found: ID does not exist" containerID="d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca"
Oct 10 09:11:22 crc kubenswrapper[4822]: I1010 09:11:22.721913 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca"} err="failed to get container status \"d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca\": rpc error: code = NotFound desc = could not find container \"d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca\": container with ID starting with d3a7396be8da755a43ef047a7a2b7d2bd6573d0b84e8d0933cdb1e2cda9932ca not found: ID does not exist"
Oct 10 09:11:22 crc kubenswrapper[4822]: E1010 09:11:22.779279 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33432ebf_1ca3_4c3a_99ed_2a258f781ccb.slice\": RecentStats: unable to find data in memory cache]"
Oct 10 09:11:23 crc kubenswrapper[4822]: I1010 09:11:23.691280 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" path="/var/lib/kubelet/pods/33432ebf-1ca3-4c3a-99ed-2a258f781ccb/volumes"
Oct 10 09:11:25 crc kubenswrapper[4822]: I1010 09:11:25.650956 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:11:25 crc kubenswrapper[4822]: E1010 09:11:25.652082 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:11:39 crc kubenswrapper[4822]: I1010 09:11:39.651991 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:11:39 crc kubenswrapper[4822]: E1010 09:11:39.653164 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.385479 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.543723 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") pod \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") "
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.543859 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gln77\" (UniqueName: \"kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77\") pod \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") "
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.543976 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert\") pod \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\" (UID: \"6a6d3351-99a2-43e0-94a7-9610fe9186eb\") "
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.556051 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "6a6d3351-99a2-43e0-94a7-9610fe9186eb" (UID: "6a6d3351-99a2-43e0-94a7-9610fe9186eb"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.558521 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77" (OuterVolumeSpecName: "kube-api-access-gln77") pod "6a6d3351-99a2-43e0-94a7-9610fe9186eb" (UID: "6a6d3351-99a2-43e0-94a7-9610fe9186eb"). InnerVolumeSpecName "kube-api-access-gln77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.560182 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885" (OuterVolumeSpecName: "ovn-data") pod "6a6d3351-99a2-43e0-94a7-9610fe9186eb" (UID: "6a6d3351-99a2-43e0-94a7-9610fe9186eb"). InnerVolumeSpecName "pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.646947 4822 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") on node \"crc\" "
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.647001 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gln77\" (UniqueName: \"kubernetes.io/projected/6a6d3351-99a2-43e0-94a7-9610fe9186eb-kube-api-access-gln77\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.647028 4822 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6a6d3351-99a2-43e0-94a7-9610fe9186eb-ovn-data-cert\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.682778 4822 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.683062 4822 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885") on node "crc"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.754960 4822 reconciler_common.go:293] "Volume detached for volume \"pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7a436bc9-5cc1-400f-82d5-9620cb5e3885\") on node \"crc\" DevicePath \"\""
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.887560 4822 generic.go:334] "Generic (PLEG): container finished" podID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" containerID="4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df" exitCode=137
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.887746 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6a6d3351-99a2-43e0-94a7-9610fe9186eb","Type":"ContainerDied","Data":"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"}
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.887777 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.888785 4822 scope.go:117] "RemoveContainer" containerID="4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.888883 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6a6d3351-99a2-43e0-94a7-9610fe9186eb","Type":"ContainerDied","Data":"3e7ce490735967f14de67340df57fea61d6c4495e78ea53eb0431784935ce1b3"}
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.928090 4822 scope.go:117] "RemoveContainer" containerID="4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"
Oct 10 09:11:45 crc kubenswrapper[4822]: E1010 09:11:45.929155 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df\": container with ID starting with 4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df not found: ID does not exist" containerID="4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.929226 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df"} err="failed to get container status \"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df\": rpc error: code = NotFound desc = could not find container \"4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df\": container with ID starting with 4f3f27b71c89307f2b2ff9dec55c97c4ffd7f47ba58c87c64c84daaae7b249df not found: ID does not exist"
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.940752 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Oct 10 09:11:45 crc kubenswrapper[4822]: I1010 09:11:45.956059 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"]
Oct 10 09:11:47 crc kubenswrapper[4822]: I1010 09:11:47.664832 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" path="/var/lib/kubelet/pods/6a6d3351-99a2-43e0-94a7-9610fe9186eb/volumes"
Oct 10 09:11:53 crc kubenswrapper[4822]: I1010 09:11:53.665966 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:11:53 crc kubenswrapper[4822]: E1010 09:11:53.667673 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:12:07 crc kubenswrapper[4822]: I1010 09:12:07.650037 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35"
Oct 10 09:12:07 crc kubenswrapper[4822]: E1010 09:12:07.652151 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.247138 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"]
Oct 10 09:12:11 crc kubenswrapper[4822]: E1010 09:12:11.250626 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" containerName="adoption"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.250881 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" containerName="adoption"
Oct 10 09:12:11 crc kubenswrapper[4822]: E1010 09:12:11.251054 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="registry-server"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.251193 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="registry-server"
Oct 10 09:12:11 crc kubenswrapper[4822]: E1010 09:12:11.251362 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="extract-content"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.251672 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="extract-content"
Oct 10 09:12:11 crc kubenswrapper[4822]: E1010 09:12:11.251957 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="extract-utilities"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.252134 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="extract-utilities"
Oct 10 09:12:11 crc kubenswrapper[4822]: E1010 09:12:11.252443 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e167c28-a75e-4df7-b218-c5b29939fa82" containerName="adoption"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.252597 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e167c28-a75e-4df7-b218-c5b29939fa82" containerName="adoption"
Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.253195 4822 memory_manager.go:354] "RemoveStaleState removing state"
podUID="6e167c28-a75e-4df7-b218-c5b29939fa82" containerName="adoption" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.253390 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6d3351-99a2-43e0-94a7-9610fe9186eb" containerName="adoption" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.253593 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="33432ebf-1ca3-4c3a-99ed-2a258f781ccb" containerName="registry-server" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.257519 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.264289 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"] Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.332992 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9hh\" (UniqueName: \"kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.333406 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.333476 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content\") pod \"redhat-marketplace-wl5r6\" (UID: 
\"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.435708 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9hh\" (UniqueName: \"kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.435786 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.435861 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.436522 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.436683 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " 
pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.464545 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9hh\" (UniqueName: \"kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh\") pod \"redhat-marketplace-wl5r6\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:11 crc kubenswrapper[4822]: I1010 09:12:11.609447 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:12 crc kubenswrapper[4822]: I1010 09:12:12.175269 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"] Oct 10 09:12:12 crc kubenswrapper[4822]: I1010 09:12:12.275007 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerStarted","Data":"0cf94c0da6b30b24e1c219adfe79345425373cee0822365b1ed5348530510ddd"} Oct 10 09:12:13 crc kubenswrapper[4822]: I1010 09:12:13.289686 4822 generic.go:334] "Generic (PLEG): container finished" podID="3394be44-2f37-4d0a-8f74-161557767220" containerID="d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995" exitCode=0 Oct 10 09:12:13 crc kubenswrapper[4822]: I1010 09:12:13.289724 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerDied","Data":"d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995"} Oct 10 09:12:14 crc kubenswrapper[4822]: I1010 09:12:14.316954 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" 
event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerStarted","Data":"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c"} Oct 10 09:12:14 crc kubenswrapper[4822]: E1010 09:12:14.516337 4822 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3394be44_2f37_4d0a_8f74_161557767220.slice/crio-conmon-8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3394be44_2f37_4d0a_8f74_161557767220.slice/crio-8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c.scope\": RecentStats: unable to find data in memory cache]" Oct 10 09:12:15 crc kubenswrapper[4822]: I1010 09:12:15.341150 4822 generic.go:334] "Generic (PLEG): container finished" podID="3394be44-2f37-4d0a-8f74-161557767220" containerID="8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c" exitCode=0 Oct 10 09:12:15 crc kubenswrapper[4822]: I1010 09:12:15.341458 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerDied","Data":"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c"} Oct 10 09:12:16 crc kubenswrapper[4822]: I1010 09:12:16.355993 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerStarted","Data":"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0"} Oct 10 09:12:16 crc kubenswrapper[4822]: I1010 09:12:16.387948 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wl5r6" podStartSLOduration=2.774636116 podStartE2EDuration="5.387927109s" podCreationTimestamp="2025-10-10 
09:12:11 +0000 UTC" firstStartedPulling="2025-10-10 09:12:13.292106894 +0000 UTC m=+10080.387265130" lastFinishedPulling="2025-10-10 09:12:15.905397887 +0000 UTC m=+10083.000556123" observedRunningTime="2025-10-10 09:12:16.381219636 +0000 UTC m=+10083.476377852" watchObservedRunningTime="2025-10-10 09:12:16.387927109 +0000 UTC m=+10083.483085315" Oct 10 09:12:21 crc kubenswrapper[4822]: I1010 09:12:21.610548 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:21 crc kubenswrapper[4822]: I1010 09:12:21.611222 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:21 crc kubenswrapper[4822]: I1010 09:12:21.651132 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:12:21 crc kubenswrapper[4822]: E1010 09:12:21.651735 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:12:21 crc kubenswrapper[4822]: I1010 09:12:21.695832 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:22 crc kubenswrapper[4822]: I1010 09:12:22.495352 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:22 crc kubenswrapper[4822]: I1010 09:12:22.585927 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"] Oct 10 09:12:24 crc kubenswrapper[4822]: I1010 
09:12:24.448558 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wl5r6" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="registry-server" containerID="cri-o://792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0" gracePeriod=2 Oct 10 09:12:24 crc kubenswrapper[4822]: I1010 09:12:24.981293 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.120872 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities\") pod \"3394be44-2f37-4d0a-8f74-161557767220\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.121018 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl9hh\" (UniqueName: \"kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh\") pod \"3394be44-2f37-4d0a-8f74-161557767220\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.121134 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content\") pod \"3394be44-2f37-4d0a-8f74-161557767220\" (UID: \"3394be44-2f37-4d0a-8f74-161557767220\") " Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.122671 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities" (OuterVolumeSpecName: "utilities") pod "3394be44-2f37-4d0a-8f74-161557767220" (UID: "3394be44-2f37-4d0a-8f74-161557767220"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.133782 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh" (OuterVolumeSpecName: "kube-api-access-jl9hh") pod "3394be44-2f37-4d0a-8f74-161557767220" (UID: "3394be44-2f37-4d0a-8f74-161557767220"). InnerVolumeSpecName "kube-api-access-jl9hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.146440 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3394be44-2f37-4d0a-8f74-161557767220" (UID: "3394be44-2f37-4d0a-8f74-161557767220"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.224044 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.224378 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394be44-2f37-4d0a-8f74-161557767220-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.224388 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl9hh\" (UniqueName: \"kubernetes.io/projected/3394be44-2f37-4d0a-8f74-161557767220-kube-api-access-jl9hh\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.462153 4822 generic.go:334] "Generic (PLEG): container finished" podID="3394be44-2f37-4d0a-8f74-161557767220" 
containerID="792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0" exitCode=0 Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.462206 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerDied","Data":"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0"} Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.462278 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5r6" event={"ID":"3394be44-2f37-4d0a-8f74-161557767220","Type":"ContainerDied","Data":"0cf94c0da6b30b24e1c219adfe79345425373cee0822365b1ed5348530510ddd"} Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.462309 4822 scope.go:117] "RemoveContainer" containerID="792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.464931 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5r6" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.518533 4822 scope.go:117] "RemoveContainer" containerID="8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.532188 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"] Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.552235 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5r6"] Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.570887 4822 scope.go:117] "RemoveContainer" containerID="d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.626184 4822 scope.go:117] "RemoveContainer" containerID="792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0" Oct 10 09:12:25 crc kubenswrapper[4822]: E1010 09:12:25.626662 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0\": container with ID starting with 792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0 not found: ID does not exist" containerID="792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.626737 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0"} err="failed to get container status \"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0\": rpc error: code = NotFound desc = could not find container \"792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0\": container with ID starting with 792369618e74338f9134e6f0317349a91822b20d996ca3bda0c85c52ce8844a0 not found: 
ID does not exist" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.626762 4822 scope.go:117] "RemoveContainer" containerID="8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c" Oct 10 09:12:25 crc kubenswrapper[4822]: E1010 09:12:25.627302 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c\": container with ID starting with 8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c not found: ID does not exist" containerID="8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.627336 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c"} err="failed to get container status \"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c\": rpc error: code = NotFound desc = could not find container \"8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c\": container with ID starting with 8c50ecfb97cdafd4f51e6585fd75816671df8096bed1af7789c3ac9a21df2a3c not found: ID does not exist" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.627355 4822 scope.go:117] "RemoveContainer" containerID="d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995" Oct 10 09:12:25 crc kubenswrapper[4822]: E1010 09:12:25.627867 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995\": container with ID starting with d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995 not found: ID does not exist" containerID="d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.628048 4822 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995"} err="failed to get container status \"d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995\": rpc error: code = NotFound desc = could not find container \"d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995\": container with ID starting with d4b4dd21ebc885fbf6534cdd53d19dd42bec50ecaa1dbc9e3519453123f78995 not found: ID does not exist" Oct 10 09:12:25 crc kubenswrapper[4822]: I1010 09:12:25.669504 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3394be44-2f37-4d0a-8f74-161557767220" path="/var/lib/kubelet/pods/3394be44-2f37-4d0a-8f74-161557767220/volumes" Oct 10 09:12:35 crc kubenswrapper[4822]: I1010 09:12:35.652128 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:12:35 crc kubenswrapper[4822]: E1010 09:12:35.654002 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.055628 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:36 crc kubenswrapper[4822]: E1010 09:12:36.056369 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="registry-server" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.056399 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="registry-server" Oct 10 
09:12:36 crc kubenswrapper[4822]: E1010 09:12:36.056440 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="extract-utilities" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.056455 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="extract-utilities" Oct 10 09:12:36 crc kubenswrapper[4822]: E1010 09:12:36.056528 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="extract-content" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.056538 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="extract-content" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.058018 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="3394be44-2f37-4d0a-8f74-161557767220" containerName="registry-server" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.071435 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.084470 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.204156 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5nd\" (UniqueName: \"kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.204649 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.204748 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.306108 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5nd\" (UniqueName: \"kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.306243 4822 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.306278 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.306841 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.306844 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.480255 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5nd\" (UniqueName: \"kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd\") pod \"community-operators-bg4p2\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:36 crc kubenswrapper[4822]: I1010 09:12:36.708352 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:37 crc kubenswrapper[4822]: I1010 09:12:37.152209 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:37 crc kubenswrapper[4822]: I1010 09:12:37.688107 4822 generic.go:334] "Generic (PLEG): container finished" podID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerID="589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0" exitCode=0 Oct 10 09:12:37 crc kubenswrapper[4822]: I1010 09:12:37.697379 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerDied","Data":"589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0"} Oct 10 09:12:37 crc kubenswrapper[4822]: I1010 09:12:37.697459 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerStarted","Data":"0af751aba3978779681e3c53b7c461a77f2e3dd1a6f991c622b3c5b7b54517c6"} Oct 10 09:12:39 crc kubenswrapper[4822]: I1010 09:12:39.713625 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerStarted","Data":"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550"} Oct 10 09:12:40 crc kubenswrapper[4822]: I1010 09:12:40.739477 4822 generic.go:334] "Generic (PLEG): container finished" podID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerID="928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550" exitCode=0 Oct 10 09:12:40 crc kubenswrapper[4822]: I1010 09:12:40.739677 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" 
event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerDied","Data":"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550"} Oct 10 09:12:41 crc kubenswrapper[4822]: I1010 09:12:41.756894 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerStarted","Data":"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f"} Oct 10 09:12:46 crc kubenswrapper[4822]: I1010 09:12:46.709297 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:46 crc kubenswrapper[4822]: I1010 09:12:46.710146 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:46 crc kubenswrapper[4822]: I1010 09:12:46.790148 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:46 crc kubenswrapper[4822]: I1010 09:12:46.840506 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bg4p2" podStartSLOduration=8.339086351 podStartE2EDuration="11.840460944s" podCreationTimestamp="2025-10-10 09:12:35 +0000 UTC" firstStartedPulling="2025-10-10 09:12:37.691519201 +0000 UTC m=+10104.786677407" lastFinishedPulling="2025-10-10 09:12:41.192893764 +0000 UTC m=+10108.288052000" observedRunningTime="2025-10-10 09:12:41.783035278 +0000 UTC m=+10108.878193514" watchObservedRunningTime="2025-10-10 09:12:46.840460944 +0000 UTC m=+10113.935619160" Oct 10 09:12:46 crc kubenswrapper[4822]: I1010 09:12:46.895061 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:47 crc kubenswrapper[4822]: I1010 09:12:47.038940 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:47 crc kubenswrapper[4822]: I1010 09:12:47.650867 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:12:47 crc kubenswrapper[4822]: E1010 09:12:47.651435 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:12:48 crc kubenswrapper[4822]: I1010 09:12:48.843394 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bg4p2" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="registry-server" containerID="cri-o://df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f" gracePeriod=2 Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.358346 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.451455 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c5nd\" (UniqueName: \"kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd\") pod \"25aa03be-1c23-4300-bb08-e0d3ce05f797\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.452014 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content\") pod \"25aa03be-1c23-4300-bb08-e0d3ce05f797\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.452107 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities\") pod \"25aa03be-1c23-4300-bb08-e0d3ce05f797\" (UID: \"25aa03be-1c23-4300-bb08-e0d3ce05f797\") " Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.453465 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities" (OuterVolumeSpecName: "utilities") pod "25aa03be-1c23-4300-bb08-e0d3ce05f797" (UID: "25aa03be-1c23-4300-bb08-e0d3ce05f797"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.454219 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.458671 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd" (OuterVolumeSpecName: "kube-api-access-2c5nd") pod "25aa03be-1c23-4300-bb08-e0d3ce05f797" (UID: "25aa03be-1c23-4300-bb08-e0d3ce05f797"). InnerVolumeSpecName "kube-api-access-2c5nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.510419 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25aa03be-1c23-4300-bb08-e0d3ce05f797" (UID: "25aa03be-1c23-4300-bb08-e0d3ce05f797"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.556912 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c5nd\" (UniqueName: \"kubernetes.io/projected/25aa03be-1c23-4300-bb08-e0d3ce05f797-kube-api-access-2c5nd\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.556973 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25aa03be-1c23-4300-bb08-e0d3ce05f797-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.860179 4822 generic.go:334] "Generic (PLEG): container finished" podID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerID="df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f" exitCode=0 Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.860266 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bg4p2" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.860269 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerDied","Data":"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f"} Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.860510 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg4p2" event={"ID":"25aa03be-1c23-4300-bb08-e0d3ce05f797","Type":"ContainerDied","Data":"0af751aba3978779681e3c53b7c461a77f2e3dd1a6f991c622b3c5b7b54517c6"} Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.860558 4822 scope.go:117] "RemoveContainer" containerID="df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.898588 4822 scope.go:117] "RemoveContainer" 
containerID="928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550" Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.902588 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.925121 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bg4p2"] Oct 10 09:12:49 crc kubenswrapper[4822]: I1010 09:12:49.948263 4822 scope.go:117] "RemoveContainer" containerID="589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.010143 4822 scope.go:117] "RemoveContainer" containerID="df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f" Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.011027 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f\": container with ID starting with df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f not found: ID does not exist" containerID="df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.011090 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f"} err="failed to get container status \"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f\": rpc error: code = NotFound desc = could not find container \"df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f\": container with ID starting with df9b5f1dde328bae614e213e5dd3fb1e888d92f26c3f0459afd38925bf7d884f not found: ID does not exist" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.011123 4822 scope.go:117] "RemoveContainer" 
containerID="928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550" Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.013220 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550\": container with ID starting with 928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550 not found: ID does not exist" containerID="928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.013294 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550"} err="failed to get container status \"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550\": rpc error: code = NotFound desc = could not find container \"928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550\": container with ID starting with 928d2ea04ba4cda35125f7eb7038ded17ba594cbc75ee7ff50ca16a084800550 not found: ID does not exist" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.013341 4822 scope.go:117] "RemoveContainer" containerID="589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0" Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.014038 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0\": container with ID starting with 589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0 not found: ID does not exist" containerID="589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.014082 4822 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0"} err="failed to get container status \"589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0\": rpc error: code = NotFound desc = could not find container \"589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0\": container with ID starting with 589e0e8a0f8cd61579ab812ece4b602eee84d85bab61a4f95977a8a1cd5392d0 not found: ID does not exist" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.292915 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj4jx/must-gather-v8fr6"] Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.293507 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="extract-content" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.293524 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="extract-content" Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.293557 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="extract-utilities" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.293564 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="extract-utilities" Oct 10 09:12:50 crc kubenswrapper[4822]: E1010 09:12:50.293585 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="registry-server" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.293590 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" containerName="registry-server" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.293760 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" 
containerName="registry-server" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.295440 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.297158 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lj4jx"/"default-dockercfg-6hgz8" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.297520 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lj4jx"/"kube-root-ca.crt" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.306483 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lj4jx"/"openshift-service-ca.crt" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.313642 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj4jx/must-gather-v8fr6"] Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.378858 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.379071 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c79k9\" (UniqueName: \"kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.481100 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.481160 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c79k9\" (UniqueName: \"kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.481701 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.498509 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c79k9\" (UniqueName: \"kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9\") pod \"must-gather-v8fr6\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:50 crc kubenswrapper[4822]: I1010 09:12:50.615536 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:12:51 crc kubenswrapper[4822]: I1010 09:12:51.123061 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj4jx/must-gather-v8fr6"] Oct 10 09:12:51 crc kubenswrapper[4822]: W1010 09:12:51.132775 4822 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0673d647_f8a4_43da_b9ee_019099d7fa5d.slice/crio-8572cb3e101379ee0196c343597928c9d2e3b3ba4702312d4bb2611d26cc8347 WatchSource:0}: Error finding container 8572cb3e101379ee0196c343597928c9d2e3b3ba4702312d4bb2611d26cc8347: Status 404 returned error can't find the container with id 8572cb3e101379ee0196c343597928c9d2e3b3ba4702312d4bb2611d26cc8347 Oct 10 09:12:51 crc kubenswrapper[4822]: I1010 09:12:51.665104 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25aa03be-1c23-4300-bb08-e0d3ce05f797" path="/var/lib/kubelet/pods/25aa03be-1c23-4300-bb08-e0d3ce05f797/volumes" Oct 10 09:12:51 crc kubenswrapper[4822]: I1010 09:12:51.897137 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" event={"ID":"0673d647-f8a4-43da-b9ee-019099d7fa5d","Type":"ContainerStarted","Data":"8572cb3e101379ee0196c343597928c9d2e3b3ba4702312d4bb2611d26cc8347"} Oct 10 09:12:55 crc kubenswrapper[4822]: I1010 09:12:55.937050 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" event={"ID":"0673d647-f8a4-43da-b9ee-019099d7fa5d","Type":"ContainerStarted","Data":"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea"} Oct 10 09:12:56 crc kubenswrapper[4822]: I1010 09:12:56.949285 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" 
event={"ID":"0673d647-f8a4-43da-b9ee-019099d7fa5d","Type":"ContainerStarted","Data":"b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb"} Oct 10 09:12:56 crc kubenswrapper[4822]: I1010 09:12:56.979785 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" podStartSLOduration=2.749219116 podStartE2EDuration="6.979763942s" podCreationTimestamp="2025-10-10 09:12:50 +0000 UTC" firstStartedPulling="2025-10-10 09:12:51.136163676 +0000 UTC m=+10118.231321862" lastFinishedPulling="2025-10-10 09:12:55.366708492 +0000 UTC m=+10122.461866688" observedRunningTime="2025-10-10 09:12:56.966321735 +0000 UTC m=+10124.061479991" watchObservedRunningTime="2025-10-10 09:12:56.979763942 +0000 UTC m=+10124.074922148" Oct 10 09:12:58 crc kubenswrapper[4822]: I1010 09:12:58.651275 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:12:58 crc kubenswrapper[4822]: E1010 09:12:58.651835 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.518493 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj4jx/crc-debug-bjn2w"] Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.520773 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.639277 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xtw\" (UniqueName: \"kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.639832 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.741665 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.741787 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xtw\" (UniqueName: \"kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.741830 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc 
kubenswrapper[4822]: I1010 09:13:00.766613 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xtw\" (UniqueName: \"kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw\") pod \"crc-debug-bjn2w\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:00 crc kubenswrapper[4822]: I1010 09:13:00.845852 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:02 crc kubenswrapper[4822]: I1010 09:13:02.009628 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" event={"ID":"c5eb0de7-353d-42e6-9046-afcc982fd3c7","Type":"ContainerStarted","Data":"1387dcec33d757f8554ebd50b2cd29cf7ed15552d792cec229e9e781f5f55580"} Oct 10 09:13:10 crc kubenswrapper[4822]: I1010 09:13:10.650548 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:13:10 crc kubenswrapper[4822]: E1010 09:13:10.651232 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:13:18 crc kubenswrapper[4822]: I1010 09:13:18.193584 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" event={"ID":"c5eb0de7-353d-42e6-9046-afcc982fd3c7","Type":"ContainerStarted","Data":"3f56ff33be5c85001d3cf546d186c8d068bba6cc16669c37d6172e6d1c185c7c"} Oct 10 09:13:18 crc kubenswrapper[4822]: I1010 09:13:18.216459 4822 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" podStartSLOduration=2.39560893 podStartE2EDuration="18.216439068s" podCreationTimestamp="2025-10-10 09:13:00 +0000 UTC" firstStartedPulling="2025-10-10 09:13:01.123874999 +0000 UTC m=+10128.219033195" lastFinishedPulling="2025-10-10 09:13:16.944705137 +0000 UTC m=+10144.039863333" observedRunningTime="2025-10-10 09:13:18.207626684 +0000 UTC m=+10145.302784870" watchObservedRunningTime="2025-10-10 09:13:18.216439068 +0000 UTC m=+10145.311597264" Oct 10 09:13:25 crc kubenswrapper[4822]: I1010 09:13:25.652024 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:13:25 crc kubenswrapper[4822]: E1010 09:13:25.652949 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:13:38 crc kubenswrapper[4822]: I1010 09:13:38.650605 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:13:38 crc kubenswrapper[4822]: E1010 09:13:38.651560 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:13:50 crc kubenswrapper[4822]: I1010 09:13:50.650762 4822 scope.go:117] "RemoveContainer" 
containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:13:50 crc kubenswrapper[4822]: E1010 09:13:50.652998 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:13:52 crc kubenswrapper[4822]: I1010 09:13:52.535039 4822 generic.go:334] "Generic (PLEG): container finished" podID="c5eb0de7-353d-42e6-9046-afcc982fd3c7" containerID="3f56ff33be5c85001d3cf546d186c8d068bba6cc16669c37d6172e6d1c185c7c" exitCode=0 Oct 10 09:13:52 crc kubenswrapper[4822]: I1010 09:13:52.535166 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" event={"ID":"c5eb0de7-353d-42e6-9046-afcc982fd3c7","Type":"ContainerDied","Data":"3f56ff33be5c85001d3cf546d186c8d068bba6cc16669c37d6172e6d1c185c7c"} Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.675748 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.711100 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lj4jx/crc-debug-bjn2w"] Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.721483 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lj4jx/crc-debug-bjn2w"] Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.792058 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79xtw\" (UniqueName: \"kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw\") pod \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.792336 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host\") pod \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\" (UID: \"c5eb0de7-353d-42e6-9046-afcc982fd3c7\") " Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.794584 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host" (OuterVolumeSpecName: "host") pod "c5eb0de7-353d-42e6-9046-afcc982fd3c7" (UID: "c5eb0de7-353d-42e6-9046-afcc982fd3c7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.798550 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw" (OuterVolumeSpecName: "kube-api-access-79xtw") pod "c5eb0de7-353d-42e6-9046-afcc982fd3c7" (UID: "c5eb0de7-353d-42e6-9046-afcc982fd3c7"). InnerVolumeSpecName "kube-api-access-79xtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.894755 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79xtw\" (UniqueName: \"kubernetes.io/projected/c5eb0de7-353d-42e6-9046-afcc982fd3c7-kube-api-access-79xtw\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:53 crc kubenswrapper[4822]: I1010 09:13:53.894817 4822 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5eb0de7-353d-42e6-9046-afcc982fd3c7-host\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.558115 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1387dcec33d757f8554ebd50b2cd29cf7ed15552d792cec229e9e781f5f55580" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.558179 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-bjn2w" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.891921 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj4jx/crc-debug-5mtxf"] Oct 10 09:13:54 crc kubenswrapper[4822]: E1010 09:13:54.892846 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb0de7-353d-42e6-9046-afcc982fd3c7" containerName="container-00" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.892864 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb0de7-353d-42e6-9046-afcc982fd3c7" containerName="container-00" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.893161 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5eb0de7-353d-42e6-9046-afcc982fd3c7" containerName="container-00" Oct 10 09:13:54 crc kubenswrapper[4822]: I1010 09:13:54.894064 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.019224 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnn78\" (UniqueName: \"kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.019320 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.121009 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnn78\" (UniqueName: \"kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.121133 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.121235 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc 
kubenswrapper[4822]: I1010 09:13:55.577056 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnn78\" (UniqueName: \"kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78\") pod \"crc-debug-5mtxf\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.668641 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5eb0de7-353d-42e6-9046-afcc982fd3c7" path="/var/lib/kubelet/pods/c5eb0de7-353d-42e6-9046-afcc982fd3c7/volumes" Oct 10 09:13:55 crc kubenswrapper[4822]: I1010 09:13:55.812166 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:56 crc kubenswrapper[4822]: I1010 09:13:56.582707 4822 generic.go:334] "Generic (PLEG): container finished" podID="8cefaf3d-b7b8-4908-8c26-0f0ec1265482" containerID="8e84ae86836f0977180aaba86573c22a128ce78a6bc2bef502c88901ef455a69" exitCode=1 Oct 10 09:13:56 crc kubenswrapper[4822]: I1010 09:13:56.582777 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" event={"ID":"8cefaf3d-b7b8-4908-8c26-0f0ec1265482","Type":"ContainerDied","Data":"8e84ae86836f0977180aaba86573c22a128ce78a6bc2bef502c88901ef455a69"} Oct 10 09:13:56 crc kubenswrapper[4822]: I1010 09:13:56.583208 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" event={"ID":"8cefaf3d-b7b8-4908-8c26-0f0ec1265482","Type":"ContainerStarted","Data":"d2f2186e55f35fba7f94a1cb1028c151958b726a044c70f80ad3058b306a9fc6"} Oct 10 09:13:56 crc kubenswrapper[4822]: I1010 09:13:56.639593 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lj4jx/crc-debug-5mtxf"] Oct 10 09:13:56 crc kubenswrapper[4822]: I1010 09:13:56.660039 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-lj4jx/crc-debug-5mtxf"] Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.147562 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.295568 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnn78\" (UniqueName: \"kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78\") pod \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.295756 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host\") pod \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\" (UID: \"8cefaf3d-b7b8-4908-8c26-0f0ec1265482\") " Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.296131 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host" (OuterVolumeSpecName: "host") pod "8cefaf3d-b7b8-4908-8c26-0f0ec1265482" (UID: "8cefaf3d-b7b8-4908-8c26-0f0ec1265482"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.296702 4822 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-host\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.304325 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78" (OuterVolumeSpecName: "kube-api-access-rnn78") pod "8cefaf3d-b7b8-4908-8c26-0f0ec1265482" (UID: "8cefaf3d-b7b8-4908-8c26-0f0ec1265482"). 
InnerVolumeSpecName "kube-api-access-rnn78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.398707 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnn78\" (UniqueName: \"kubernetes.io/projected/8cefaf3d-b7b8-4908-8c26-0f0ec1265482-kube-api-access-rnn78\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.617718 4822 scope.go:117] "RemoveContainer" containerID="8e84ae86836f0977180aaba86573c22a128ce78a6bc2bef502c88901ef455a69" Oct 10 09:13:58 crc kubenswrapper[4822]: I1010 09:13:58.617769 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/crc-debug-5mtxf" Oct 10 09:13:59 crc kubenswrapper[4822]: I1010 09:13:59.670538 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cefaf3d-b7b8-4908-8c26-0f0ec1265482" path="/var/lib/kubelet/pods/8cefaf3d-b7b8-4908-8c26-0f0ec1265482/volumes" Oct 10 09:14:05 crc kubenswrapper[4822]: I1010 09:14:05.650223 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:14:06 crc kubenswrapper[4822]: I1010 09:14:06.726880 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31"} Oct 10 09:14:31 crc kubenswrapper[4822]: I1010 09:14:31.530612 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a497eae8-7b84-4b5e-9916-2d07ccef9712/init-config-reloader/0.log" Oct 10 09:14:31 crc kubenswrapper[4822]: I1010 09:14:31.750233 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a497eae8-7b84-4b5e-9916-2d07ccef9712/init-config-reloader/0.log" Oct 10 
09:14:31 crc kubenswrapper[4822]: I1010 09:14:31.791739 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a497eae8-7b84-4b5e-9916-2d07ccef9712/alertmanager/0.log" Oct 10 09:14:31 crc kubenswrapper[4822]: I1010 09:14:31.828549 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_a497eae8-7b84-4b5e-9916-2d07ccef9712/config-reloader/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.083292 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67/aodh-api/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.120738 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67/aodh-evaluator/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.234409 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67/aodh-listener/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.289496 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b9375ccb-cd2d-4e1f-b0a5-5c73e8debc67/aodh-notifier/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.432961 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8db6f474-rp2qt_2a4e919a-211c-401c-a620-ad1ca22ce280/barbican-api/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.511722 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8db6f474-rp2qt_2a4e919a-211c-401c-a620-ad1ca22ce280/barbican-api-log/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.635134 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85d4945d7b-2j8vx_9ba15f45-5531-4b65-bbfc-55a051cda9a7/barbican-keystone-listener/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 
09:14:32.693688 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85d4945d7b-2j8vx_9ba15f45-5531-4b65-bbfc-55a051cda9a7/barbican-keystone-listener-log/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.820530 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7ff764c9-vhb8b_08f2317b-357b-4060-92f2-13ed8d69c226/barbican-worker/0.log" Oct 10 09:14:32 crc kubenswrapper[4822]: I1010 09:14:32.899059 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7ff764c9-vhb8b_08f2317b-357b-4060-92f2-13ed8d69c226/barbican-worker-log/0.log" Oct 10 09:14:33 crc kubenswrapper[4822]: I1010 09:14:33.091599 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-zxpzn_35c21f6c-c1c3-4191-b721-ac63a25495ab/bootstrap-openstack-openstack-cell1/0.log" Oct 10 09:14:33 crc kubenswrapper[4822]: I1010 09:14:33.853640 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_50a32b63-179a-49ea-a6b9-badbc0008449/proxy-httpd/0.log" Oct 10 09:14:33 crc kubenswrapper[4822]: I1010 09:14:33.883119 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_50a32b63-179a-49ea-a6b9-badbc0008449/ceilometer-notification-agent/0.log" Oct 10 09:14:33 crc kubenswrapper[4822]: I1010 09:14:33.910359 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_50a32b63-179a-49ea-a6b9-badbc0008449/ceilometer-central-agent/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.053556 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_50a32b63-179a-49ea-a6b9-badbc0008449/sg-core/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.194450 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-b427v_2c56d568-594f-4562-bfd2-db0617239e9c/ceph-client-openstack-openstack-cell1/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.444075 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1c5d5896-70d2-4754-9d84-a5c6128ba3c5/cinder-api-log/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.464901 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1c5d5896-70d2-4754-9d84-a5c6128ba3c5/cinder-api/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.671127 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6e280c73-7378-4070-8375-4ca5f421790a/probe/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.777642 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6e280c73-7378-4070-8375-4ca5f421790a/cinder-backup/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.875539 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89d26112-eba6-439d-a781-ead7c951a525/cinder-scheduler/0.log" Oct 10 09:14:34 crc kubenswrapper[4822]: I1010 09:14:34.928494 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89d26112-eba6-439d-a781-ead7c951a525/probe/0.log" Oct 10 09:14:35 crc kubenswrapper[4822]: I1010 09:14:35.836836 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4dc37285-4d96-4e82-97ab-e162c50877dc/probe/0.log" Oct 10 09:14:35 crc kubenswrapper[4822]: I1010 09:14:35.844867 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4dc37285-4d96-4e82-97ab-e162c50877dc/cinder-volume/0.log" Oct 10 09:14:35 crc kubenswrapper[4822]: I1010 09:14:35.853034 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-7bp7k_e42ac6ea-b249-45f2-a5fa-fdb828736e26/configure-network-openstack-openstack-cell1/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.084028 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-k6sr2_ff18ff5e-22b5-464b-a2f0-f879bc31db11/configure-os-openstack-openstack-cell1/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.201576 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56c69776bc-xvf4j_a8abe278-5814-4951-afa4-3e2c66259376/init/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.406592 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56c69776bc-xvf4j_a8abe278-5814-4951-afa4-3e2c66259376/init/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.432327 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-rsv2k_7482993c-2c6c-4ad2-9891-c6eaf07b76e3/download-cache-openstack-openstack-cell1/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.451429 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56c69776bc-xvf4j_a8abe278-5814-4951-afa4-3e2c66259376/dnsmasq-dns/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.609407 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56faf384-52ff-4e12-ab58-6ced8cf71bc0/glance-httpd/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.622250 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_56faf384-52ff-4e12-ab58-6ced8cf71bc0/glance-log/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.793629 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_10e77277-2b4e-4b63-8596-ca5185866f05/glance-log/0.log" Oct 10 09:14:36 crc kubenswrapper[4822]: I1010 09:14:36.817043 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_10e77277-2b4e-4b63-8596-ca5185866f05/glance-httpd/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.071534 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7cb6cbddc4-rj72c_cbb3bc3e-e7d8-4ef3-9e45-2cbeedb031f6/heat-api/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.148032 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-57469fb88c-jkzr5_bec9bcc5-65a6-4a3c-b3db-17ba53221d41/heat-cfnapi/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.296034 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-c46674d4f-g27ln_85c4f605-9277-4ab5-95c8-4c7e1a2e94b1/heat-engine/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.452979 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c5d44449-hr4bs_5228af92-c9c7-494f-957f-0e63f41ca0eb/horizon/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.456386 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c5d44449-hr4bs_5228af92-c9c7-494f-957f-0e63f41ca0eb/horizon-log/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.526487 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-c4wdc_628e74c8-1d98-4722-906f-588c48bea3fc/install-certs-openstack-openstack-cell1/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.660970 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-jbpgq_3ff8c46f-3adf-4c25-8ab2-de6920f49542/install-os-openstack-openstack-cell1/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: 
I1010 09:14:37.927700 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5fbf965899-hwsfk_dd9d8f27-6be6-48d7-836b-a3bc2594abe3/keystone-api/0.log" Oct 10 09:14:37 crc kubenswrapper[4822]: I1010 09:14:37.939118 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29334721-nzg8p_fd700c4a-f515-420b-876b-6875148a7725/keystone-cron/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.103546 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29334781-c8lnk_33c77dae-c2ca-4c36-8d42-11478062ad05/keystone-cron/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.158207 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_60b95e5c-694d-4b82-b3db-2aa33ebd7189/kube-state-metrics/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.347350 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-9slgf_d85b9ab1-c113-48ef-9875-ba3ebf6427f9/libvirt-openstack-openstack-cell1/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.481126 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_10f6deb2-9123-429f-b3a6-7febb7f832e2/manila-api/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.526934 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_10f6deb2-9123-429f-b3a6-7febb7f832e2/manila-api-log/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.665720 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5c51bb29-a67c-42a9-8243-6a8281745cc0/manila-scheduler/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.713672 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5c51bb29-a67c-42a9-8243-6a8281745cc0/probe/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 
09:14:38.831697 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_694e4feb-bdf7-42c7-b0d1-7e5adb7a0444/probe/0.log" Oct 10 09:14:38 crc kubenswrapper[4822]: I1010 09:14:38.867072 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_694e4feb-bdf7-42c7-b0d1-7e5adb7a0444/manila-share/0.log" Oct 10 09:14:39 crc kubenswrapper[4822]: I1010 09:14:39.248906 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7686cd5d9-vc6db_e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0/neutron-api/0.log" Oct 10 09:14:39 crc kubenswrapper[4822]: I1010 09:14:39.250035 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7686cd5d9-vc6db_e5e9e3c2-e7e6-45a2-a6eb-7f9c9d0939d0/neutron-httpd/0.log" Oct 10 09:14:39 crc kubenswrapper[4822]: I1010 09:14:39.593410 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-jb6k4_718c7cdd-089d-42bc-8cab-74214e8ddeb3/neutron-dhcp-openstack-openstack-cell1/0.log" Oct 10 09:14:39 crc kubenswrapper[4822]: I1010 09:14:39.668331 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-lr864_06e3062b-6e37-4323-97a4-03c9dda66887/neutron-metadata-openstack-openstack-cell1/0.log" Oct 10 09:14:39 crc kubenswrapper[4822]: I1010 09:14:39.870665 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-n9tvx_9d03d865-d40b-44f1-adf6-bc451617f98a/neutron-sriov-openstack-openstack-cell1/0.log" Oct 10 09:14:40 crc kubenswrapper[4822]: I1010 09:14:40.113264 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_706de7f2-6f04-48b5-9510-aed6e16b14e1/nova-api-api/0.log" Oct 10 09:14:40 crc kubenswrapper[4822]: I1010 09:14:40.258894 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_706de7f2-6f04-48b5-9510-aed6e16b14e1/nova-api-log/0.log" Oct 10 09:14:40 crc kubenswrapper[4822]: I1010 09:14:40.422296 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_26937e57-4862-4167-b627-9711868b3b60/nova-cell0-conductor-conductor/0.log" Oct 10 09:14:40 crc kubenswrapper[4822]: I1010 09:14:40.534262 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a43abc0e-2683-43f5-ab5e-5687c8ecd71b/nova-cell1-conductor-conductor/0.log" Oct 10 09:14:40 crc kubenswrapper[4822]: I1010 09:14:40.723768 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7e555555-d99c-4a6f-af86-74539d7163e4/nova-cell1-novncproxy-novncproxy/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.312935 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfqcrl_1992f703-ffd6-47f1-ad78-44790cb9e0be/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.375712 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-frjzw_afd1eb20-744e-41a9-84b0-0dfb89dc1cea/nova-cell1-openstack-openstack-cell1/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.577692 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a21f845d-863d-4fb5-9303-13710403c771/nova-metadata-log/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.607584 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a21f845d-863d-4fb5-9303-13710403c771/nova-metadata-metadata/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.873218 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_fb0c61d8-b919-4131-a03a-6fe380018721/nova-scheduler-scheduler/0.log" Oct 10 09:14:41 crc kubenswrapper[4822]: I1010 09:14:41.927991 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bf4dc558d-pljb9_14451cd9-53f4-44f9-987c-d7764a65543d/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.172064 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bf4dc558d-pljb9_14451cd9-53f4-44f9-987c-d7764a65543d/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.324288 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bf4dc558d-pljb9_14451cd9-53f4-44f9-987c-d7764a65543d/octavia-api-provider-agent/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.334640 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_661709bf-a225-4ecb-a2ce-245c8dd7af77/memcached/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.405384 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-bf4dc558d-pljb9_14451cd9-53f4-44f9-987c-d7764a65543d/octavia-api/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.526586 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-cbvgt_17cf8698-d96c-44da-b74d-8c1986940707/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.710318 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jqvqw_35e6cf62-30fb-4506-be4f-1e15c90cfaf1/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.728820 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-cbvgt_17cf8698-d96c-44da-b74d-8c1986940707/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.789125 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-cbvgt_17cf8698-d96c-44da-b74d-8c1986940707/octavia-healthmanager/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.881982 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jqvqw_35e6cf62-30fb-4506-be4f-1e15c90cfaf1/init/0.log" Oct 10 09:14:42 crc kubenswrapper[4822]: I1010 09:14:42.946046 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-jqvqw_35e6cf62-30fb-4506-be4f-1e15c90cfaf1/octavia-housekeeping/0.log" Oct 10 09:14:43 crc kubenswrapper[4822]: I1010 09:14:43.008204 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-vhlx4_635e9734-7d78-4fff-8491-37c1fc8b69e1/init/0.log" Oct 10 09:14:43 crc kubenswrapper[4822]: I1010 09:14:43.139183 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-vhlx4_635e9734-7d78-4fff-8491-37c1fc8b69e1/octavia-amphora-httpd/0.log" Oct 10 09:14:43 crc kubenswrapper[4822]: I1010 09:14:43.161326 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-vhlx4_635e9734-7d78-4fff-8491-37c1fc8b69e1/init/0.log" Oct 10 09:14:43 crc kubenswrapper[4822]: I1010 09:14:43.190729 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wcrxk_e63af2a0-2c50-4576-971d-b276062144d6/init/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.091877 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wcrxk_e63af2a0-2c50-4576-971d-b276062144d6/octavia-rsyslog/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.111978 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wcrxk_e63af2a0-2c50-4576-971d-b276062144d6/init/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.115928 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-5tqgz_416b6683-2b79-4be3-bb3b-4ece64ea85c4/init/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.339689 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5tqgz_416b6683-2b79-4be3-bb3b-4ece64ea85c4/init/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.393708 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_123c29f8-7721-41ab-81f1-887769d9e1c2/mysql-bootstrap/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.399973 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5tqgz_416b6683-2b79-4be3-bb3b-4ece64ea85c4/octavia-worker/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.581561 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_123c29f8-7721-41ab-81f1-887769d9e1c2/mysql-bootstrap/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.588120 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_123c29f8-7721-41ab-81f1-887769d9e1c2/galera/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.736728 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cbd2b8da-ad61-4a22-be7c-5639531463de/mysql-bootstrap/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.873007 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cbd2b8da-ad61-4a22-be7c-5639531463de/mysql-bootstrap/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.896890 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_cbd2b8da-ad61-4a22-be7c-5639531463de/galera/0.log" Oct 10 09:14:44 crc kubenswrapper[4822]: I1010 09:14:44.945477 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_a3817196-a9d9-404a-88be-36429ce51c70/openstackclient/0.log" Oct 10 09:14:45 crc kubenswrapper[4822]: I1010 09:14:45.094589 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-82prw_c00f6312-7e6e-4afd-a8c9-000088ad9fb4/ovn-controller/0.log" Oct 10 09:14:45 crc kubenswrapper[4822]: I1010 09:14:45.119963 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pzzql_8f99b4ad-8bf1-436c-a2a4-ed8064abf4ad/openstack-network-exporter/0.log" Oct 10 09:14:45 crc kubenswrapper[4822]: I1010 09:14:45.954454 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bckd7_80ce8e7a-d332-4f3c-a8ec-30052721c927/ovsdb-server-init/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.144062 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bckd7_80ce8e7a-d332-4f3c-a8ec-30052721c927/ovsdb-server-init/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.196901 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bckd7_80ce8e7a-d332-4f3c-a8ec-30052721c927/ovsdb-server/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.217247 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bckd7_80ce8e7a-d332-4f3c-a8ec-30052721c927/ovs-vswitchd/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.383863 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_438d0867-b9d5-4195-bff2-6f70a239db66/ovn-northd/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.389718 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_438d0867-b9d5-4195-bff2-6f70a239db66/openstack-network-exporter/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.618524 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-m7m8g_9008d029-8eb9-482b-bd9a-fd5bc2259bc1/ovn-openstack-openstack-cell1/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.650717 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2239595c-2d41-4737-8496-a42a7215fdcc/openstack-network-exporter/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.766028 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2239595c-2d41-4737-8496-a42a7215fdcc/ovsdbserver-nb/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.827971 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_45235ece-2520-44af-bb2d-2083eaa25753/openstack-network-exporter/0.log" Oct 10 09:14:46 crc kubenswrapper[4822]: I1010 09:14:46.886211 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_45235ece-2520-44af-bb2d-2083eaa25753/ovsdbserver-nb/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.007261 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_92f35517-5623-47ab-b0b4-b9e9d9be4a9e/openstack-network-exporter/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.129234 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_92f35517-5623-47ab-b0b4-b9e9d9be4a9e/ovsdbserver-nb/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.175677 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c74b491-d30e-4d3a-8dc8-532b38928ef6/openstack-network-exporter/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.251377 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c74b491-d30e-4d3a-8dc8-532b38928ef6/ovsdbserver-sb/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.312365 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_d5280bf6-7247-4221-a89b-b6a8a7c6a425/openstack-network-exporter/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.370340 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_d5280bf6-7247-4221-a89b-b6a8a7c6a425/ovsdbserver-sb/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.477714 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_4a6a4a40-9317-4238-b106-7f2b1f900ad3/openstack-network-exporter/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.523073 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_4a6a4a40-9317-4238-b106-7f2b1f900ad3/ovsdbserver-sb/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.694575 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c56b55c6d-mgk7x_1dc18be1-e71e-4d36-ba14-39c872d97771/placement-api/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.771416 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c56b55c6d-mgk7x_1dc18be1-e71e-4d36-ba14-39c872d97771/placement-log/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.804568 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c5286f_352cc9ff-7a6d-4e9e-b662-5051df6b3be7/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 10 09:14:47 crc kubenswrapper[4822]: I1010 09:14:47.950744 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_600d3d37-326b-432d-8fb9-1eab04ab53e9/init-config-reloader/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.116734 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_600d3d37-326b-432d-8fb9-1eab04ab53e9/init-config-reloader/0.log" Oct 10 09:14:48 
crc kubenswrapper[4822]: I1010 09:14:48.119320 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_600d3d37-326b-432d-8fb9-1eab04ab53e9/config-reloader/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.132173 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_600d3d37-326b-432d-8fb9-1eab04ab53e9/thanos-sidecar/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.140825 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_600d3d37-326b-432d-8fb9-1eab04ab53e9/prometheus/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.300876 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3e8c0f60-e5ce-4c39-bf4b-7a9483088159/setup-container/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.472853 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3e8c0f60-e5ce-4c39-bf4b-7a9483088159/setup-container/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.530486 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6122a003-1c32-48ac-a2f8-902b898977ca/setup-container/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.546725 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3e8c0f60-e5ce-4c39-bf4b-7a9483088159/rabbitmq/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.785282 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6122a003-1c32-48ac-a2f8-902b898977ca/setup-container/0.log" Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.806379 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-962c2_71cdd189-590b-495e-b841-83ab3662979f/reboot-os-openstack-openstack-cell1/0.log" 
Oct 10 09:14:48 crc kubenswrapper[4822]: I1010 09:14:48.828614 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6122a003-1c32-48ac-a2f8-902b898977ca/rabbitmq/0.log" Oct 10 09:14:49 crc kubenswrapper[4822]: I1010 09:14:49.004245 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-ttd5s_6114ecb9-28a8-4e70-96a1-ed43697c60b8/run-os-openstack-openstack-cell1/0.log" Oct 10 09:14:49 crc kubenswrapper[4822]: I1010 09:14:49.042161 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-8tnqf_43ed3e6d-8ebe-4534-9db2-d84e95cf0748/ssh-known-hosts-openstack/0.log" Oct 10 09:14:49 crc kubenswrapper[4822]: I1010 09:14:49.334391 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-8tcnv_94266217-0cc8-4e5c-a9bd-155671c58a19/telemetry-openstack-openstack-cell1/0.log" Oct 10 09:14:49 crc kubenswrapper[4822]: I1010 09:14:49.388635 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-rwtkl_cb5a26a4-8ae3-4b96-a905-70be164e9198/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 10 09:14:49 crc kubenswrapper[4822]: I1010 09:14:49.467177 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-fl6w8_f4b5919e-eeeb-4649-9ac9-6c18676b2a5b/validate-network-openstack-openstack-cell1/0.log" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.176312 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8"] Oct 10 09:15:00 crc kubenswrapper[4822]: E1010 09:15:00.177343 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cefaf3d-b7b8-4908-8c26-0f0ec1265482" containerName="container-00" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.177355 4822 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8cefaf3d-b7b8-4908-8c26-0f0ec1265482" containerName="container-00" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.177580 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cefaf3d-b7b8-4908-8c26-0f0ec1265482" containerName="container-00" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.178399 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.182454 4822 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.182724 4822 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.188249 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8"] Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.188287 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsh5t\" (UniqueName: \"kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.188605 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.188788 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.302377 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.302451 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.302559 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsh5t\" (UniqueName: \"kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.303691 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.310501 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.320617 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsh5t\" (UniqueName: \"kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t\") pod \"collect-profiles-29334795-z6xs8\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:00 crc kubenswrapper[4822]: I1010 09:15:00.503372 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:01 crc kubenswrapper[4822]: I1010 09:15:01.023635 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8"] Oct 10 09:15:01 crc kubenswrapper[4822]: I1010 09:15:01.318134 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" event={"ID":"7fef64b0-4014-44ba-9f23-5e43dcadd413","Type":"ContainerStarted","Data":"36eca719db12ac484861dd615cce0d8d002833ee69a28cb0aff6dcf6f662b8b7"} Oct 10 09:15:01 crc kubenswrapper[4822]: I1010 09:15:01.318177 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" event={"ID":"7fef64b0-4014-44ba-9f23-5e43dcadd413","Type":"ContainerStarted","Data":"53446cefdbdea7a1ab9d6374d5d8abd02604d2895dab35b574945f8ae0066cd1"} Oct 10 09:15:01 crc kubenswrapper[4822]: I1010 09:15:01.344458 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" podStartSLOduration=1.344438313 podStartE2EDuration="1.344438313s" podCreationTimestamp="2025-10-10 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:15:01.33704554 +0000 UTC m=+10248.432203746" watchObservedRunningTime="2025-10-10 09:15:01.344438313 +0000 UTC m=+10248.439596509" Oct 10 09:15:02 crc kubenswrapper[4822]: I1010 09:15:02.328535 4822 generic.go:334] "Generic (PLEG): container finished" podID="7fef64b0-4014-44ba-9f23-5e43dcadd413" containerID="36eca719db12ac484861dd615cce0d8d002833ee69a28cb0aff6dcf6f662b8b7" exitCode=0 Oct 10 09:15:02 crc kubenswrapper[4822]: I1010 09:15:02.328632 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" event={"ID":"7fef64b0-4014-44ba-9f23-5e43dcadd413","Type":"ContainerDied","Data":"36eca719db12ac484861dd615cce0d8d002833ee69a28cb0aff6dcf6f662b8b7"} Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.782684 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.876461 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume\") pod \"7fef64b0-4014-44ba-9f23-5e43dcadd413\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.876968 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume\") pod \"7fef64b0-4014-44ba-9f23-5e43dcadd413\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.877132 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsh5t\" (UniqueName: \"kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t\") pod \"7fef64b0-4014-44ba-9f23-5e43dcadd413\" (UID: \"7fef64b0-4014-44ba-9f23-5e43dcadd413\") " Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.877542 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fef64b0-4014-44ba-9f23-5e43dcadd413" (UID: "7fef64b0-4014-44ba-9f23-5e43dcadd413"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.878045 4822 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fef64b0-4014-44ba-9f23-5e43dcadd413-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.882359 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fef64b0-4014-44ba-9f23-5e43dcadd413" (UID: "7fef64b0-4014-44ba-9f23-5e43dcadd413"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.889230 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t" (OuterVolumeSpecName: "kube-api-access-bsh5t") pod "7fef64b0-4014-44ba-9f23-5e43dcadd413" (UID: "7fef64b0-4014-44ba-9f23-5e43dcadd413"). InnerVolumeSpecName "kube-api-access-bsh5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.980992 4822 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fef64b0-4014-44ba-9f23-5e43dcadd413-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:03 crc kubenswrapper[4822]: I1010 09:15:03.981041 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsh5t\" (UniqueName: \"kubernetes.io/projected/7fef64b0-4014-44ba-9f23-5e43dcadd413-kube-api-access-bsh5t\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:04 crc kubenswrapper[4822]: I1010 09:15:04.353377 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" event={"ID":"7fef64b0-4014-44ba-9f23-5e43dcadd413","Type":"ContainerDied","Data":"53446cefdbdea7a1ab9d6374d5d8abd02604d2895dab35b574945f8ae0066cd1"} Oct 10 09:15:04 crc kubenswrapper[4822]: I1010 09:15:04.353417 4822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53446cefdbdea7a1ab9d6374d5d8abd02604d2895dab35b574945f8ae0066cd1" Oct 10 09:15:04 crc kubenswrapper[4822]: I1010 09:15:04.353479 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-z6xs8" Oct 10 09:15:04 crc kubenswrapper[4822]: I1010 09:15:04.446041 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4"] Oct 10 09:15:04 crc kubenswrapper[4822]: I1010 09:15:04.458141 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-72tc4"] Oct 10 09:15:05 crc kubenswrapper[4822]: I1010 09:15:05.671655 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa2ec76-1a60-4849-98cc-c30e58af2078" path="/var/lib/kubelet/pods/2aa2ec76-1a60-4849-98cc-c30e58af2078/volumes" Oct 10 09:15:40 crc kubenswrapper[4822]: I1010 09:15:40.904085 4822 scope.go:117] "RemoveContainer" containerID="aed808d0613828ad03d66294de6ebd44b089dcdd6c0b1da06589d2c44116e1e1" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 09:16:28.193595 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-wpwlg_a21a0dcd-c8b2-4ea4-ab7e-edae527ab347/kube-rbac-proxy/0.log" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 09:16:28.365663 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-2h5rm_e757a212-d95c-4ffc-ae84-ceca5cc56cc2/kube-rbac-proxy/0.log" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 09:16:28.728055 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/util/0.log" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 09:16:28.893408 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-2h5rm_e757a212-d95c-4ffc-ae84-ceca5cc56cc2/manager/0.log" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 
09:16:28.895502 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/util/0.log" Oct 10 09:16:28 crc kubenswrapper[4822]: I1010 09:16:28.923016 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-wpwlg_a21a0dcd-c8b2-4ea4-ab7e-edae527ab347/manager/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.024608 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/pull/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.133702 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/pull/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.278212 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/pull/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.345977 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/extract/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.419170 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d787cd4e53d3a887f07ff60991dd85bc87d132a246ab0d066265480101zp2nd_1798b7f5-76e8-4545-a64e-e05a472f0eac/util/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.462490 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-j9ksb_be44c9df-65d4-4a6a-8646-7687f601f6b6/kube-rbac-proxy/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.567947 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-j9ksb_be44c9df-65d4-4a6a-8646-7687f601f6b6/manager/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.672895 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-76qx8_3e087283-f802-4b9c-9f1f-bbca4e30a892/kube-rbac-proxy/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.886512 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-76qx8_3e087283-f802-4b9c-9f1f-bbca4e30a892/manager/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.887382 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-grq55_367aa79b-9342-431b-ade4-a9195844ce4a/kube-rbac-proxy/0.log" Oct 10 09:16:29 crc kubenswrapper[4822]: I1010 09:16:29.910829 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-grq55_367aa79b-9342-431b-ade4-a9195844ce4a/manager/0.log" Oct 10 09:16:30 crc kubenswrapper[4822]: I1010 09:16:30.135166 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-gmgqr_9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8/manager/0.log" Oct 10 09:16:30 crc kubenswrapper[4822]: I1010 09:16:30.144703 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-gmgqr_9b6a24b4-dba7-4deb-b2d0-53fd1153e8e8/kube-rbac-proxy/0.log" Oct 10 09:16:30 crc kubenswrapper[4822]: I1010 09:16:30.226449 
4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rtgvd_f3f904c2-4da1-46c2-83c6-2ba18d9ccc50/kube-rbac-proxy/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.008433 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-tbj65_f56e0976-eb7a-4bcf-bde2-016c83567fc6/kube-rbac-proxy/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.054781 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-tbj65_f56e0976-eb7a-4bcf-bde2-016c83567fc6/manager/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.279733 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rtgvd_f3f904c2-4da1-46c2-83c6-2ba18d9ccc50/manager/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.313475 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-4h8gw_49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42/kube-rbac-proxy/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.337048 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.337325 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:16:31 crc 
kubenswrapper[4822]: I1010 09:16:31.457043 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-4h8gw_49bbd4f9-7e19-4a1a-90c6-b8ad5cf54b42/manager/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.460886 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-2fwqj_6133aeb2-a9e5-4170-a6e1-b562cdb97975/kube-rbac-proxy/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.612965 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-2fwqj_6133aeb2-a9e5-4170-a6e1-b562cdb97975/manager/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.673717 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-dt47l_1ff50152-dd82-48ae-bca4-150c0f892185/kube-rbac-proxy/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.779438 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-dt47l_1ff50152-dd82-48ae-bca4-150c0f892185/manager/0.log" Oct 10 09:16:31 crc kubenswrapper[4822]: I1010 09:16:31.910448 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-rvhj6_8d454f08-e347-4e39-8392-9c5b4a2a8f6b/kube-rbac-proxy/0.log" Oct 10 09:16:32 crc kubenswrapper[4822]: I1010 09:16:32.057107 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-k8fww_111bbcd0-554c-4705-a874-1d3aa399a391/kube-rbac-proxy/0.log" Oct 10 09:16:32 crc kubenswrapper[4822]: I1010 09:16:32.227851 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-rvhj6_8d454f08-e347-4e39-8392-9c5b4a2a8f6b/manager/0.log" Oct 10 09:16:32 crc kubenswrapper[4822]: I1010 09:16:32.403420 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-k8fww_111bbcd0-554c-4705-a874-1d3aa399a391/manager/0.log" Oct 10 09:16:32 crc kubenswrapper[4822]: I1010 09:16:32.990835 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74_04dda440-ebd4-412b-9a18-655a9721229d/kube-rbac-proxy/0.log" Oct 10 09:16:32 crc kubenswrapper[4822]: I1010 09:16:32.992041 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dqjl74_04dda440-ebd4-412b-9a18-655a9721229d/manager/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.030616 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-7d77l_8c66ad3b-7962-4747-97d4-a2c183d25ebc/kube-rbac-proxy/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.085485 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-7d77l_8c66ad3b-7962-4747-97d4-a2c183d25ebc/manager/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.222833 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-58fd854765-9cj6d_5178ccba-ae40-49f5-9fba-6df6b0fbb562/kube-rbac-proxy/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.294237 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8485b86f76-68qqr_9d259a92-f5da-477f-921c-274e9d77cd01/kube-rbac-proxy/0.log" Oct 10 09:16:33 crc 
kubenswrapper[4822]: I1010 09:16:33.524222 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7wjsb_f2d285c1-f470-4e43-a470-1bfad25e8ee8/registry-server/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.603709 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8485b86f76-68qqr_9d259a92-f5da-477f-921c-274e9d77cd01/operator/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.698286 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-tb22w_b923e24a-92ce-4c4a-8c26-d4fe2b1563ad/kube-rbac-proxy/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.826702 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-tb22w_b923e24a-92ce-4c4a-8c26-d4fe2b1563ad/manager/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.858616 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9q6b5_0da053a2-94d4-41de-87a8-d7f2662d9b5b/kube-rbac-proxy/0.log" Oct 10 09:16:33 crc kubenswrapper[4822]: I1010 09:16:33.949836 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-9q6b5_0da053a2-94d4-41de-87a8-d7f2662d9b5b/manager/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.147309 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-glnm8_b617d013-1412-4783-b71f-f3142cf15c35/operator/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.258786 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-b5g5r_9544ed2b-6308-4681-a120-d134ee029ded/kube-rbac-proxy/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.374566 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2cvmc_d01da9fa-a63b-4496-bea1-37048e323618/kube-rbac-proxy/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.376628 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-b5g5r_9544ed2b-6308-4681-a120-d134ee029ded/manager/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.670845 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-2vmqc_7c2224b9-8bd4-4967-9e24-59ce223b2e0e/kube-rbac-proxy/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.671205 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-2vmqc_7c2224b9-8bd4-4967-9e24-59ce223b2e0e/manager/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.832860 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2cvmc_d01da9fa-a63b-4496-bea1-37048e323618/manager/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.884030 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-hh26l_57520796-d080-466b-9070-c4cd032ed8ab/kube-rbac-proxy/0.log" Oct 10 09:16:34 crc kubenswrapper[4822]: I1010 09:16:34.920189 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-hh26l_57520796-d080-466b-9070-c4cd032ed8ab/manager/0.log" Oct 10 09:16:35 crc kubenswrapper[4822]: I1010 09:16:35.980928 
4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-58fd854765-9cj6d_5178ccba-ae40-49f5-9fba-6df6b0fbb562/manager/0.log" Oct 10 09:16:52 crc kubenswrapper[4822]: I1010 09:16:52.402446 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mfnp2_3d322408-d6af-47c5-afe2-995737d9d6e2/control-plane-machine-set-operator/0.log" Oct 10 09:16:53 crc kubenswrapper[4822]: I1010 09:16:53.259950 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h59tt_27d8f9ac-f418-46b8-9f7a-8bfc8dde1755/kube-rbac-proxy/0.log" Oct 10 09:16:53 crc kubenswrapper[4822]: I1010 09:16:53.293275 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h59tt_27d8f9ac-f418-46b8-9f7a-8bfc8dde1755/machine-api-operator/0.log" Oct 10 09:17:01 crc kubenswrapper[4822]: I1010 09:17:01.337031 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:17:01 crc kubenswrapper[4822]: I1010 09:17:01.337627 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:17:07 crc kubenswrapper[4822]: I1010 09:17:07.865201 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-lgjkx_7d9258d7-7df3-4bb8-8190-7dcb1a930744/cert-manager-cainjector/0.log" Oct 10 09:17:07 crc 
kubenswrapper[4822]: I1010 09:17:07.885961 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-pv7b6_7f2ccfb6-3394-43f7-93a9-ec9d73ea38d0/cert-manager-controller/0.log" Oct 10 09:17:08 crc kubenswrapper[4822]: I1010 09:17:08.056530 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-d8lf4_046b28df-836a-482d-8387-f40aef735dce/cert-manager-webhook/0.log" Oct 10 09:17:20 crc kubenswrapper[4822]: I1010 09:17:20.752108 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-4w4wc_044d3970-563c-4037-8388-21e91330f82c/nmstate-console-plugin/0.log" Oct 10 09:17:20 crc kubenswrapper[4822]: I1010 09:17:20.913695 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6xzcn_e6165414-b0cd-4f52-a9bc-da894b2cf483/kube-rbac-proxy/0.log" Oct 10 09:17:20 crc kubenswrapper[4822]: I1010 09:17:20.916323 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lkp99_6482591b-0943-4d7c-90ac-054449e582ba/nmstate-handler/0.log" Oct 10 09:17:20 crc kubenswrapper[4822]: I1010 09:17:20.948657 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6xzcn_e6165414-b0cd-4f52-a9bc-da894b2cf483/nmstate-metrics/0.log" Oct 10 09:17:21 crc kubenswrapper[4822]: I1010 09:17:21.114390 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-z4s2h_e5c07223-93dd-414b-b8c2-c177e0e8c4e9/nmstate-webhook/0.log" Oct 10 09:17:21 crc kubenswrapper[4822]: I1010 09:17:21.124153 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-mmtq2_7b65d092-89df-47ee-81f0-c48ab056e714/nmstate-operator/0.log" Oct 10 09:17:31 crc kubenswrapper[4822]: I1010 09:17:31.336718 4822 patch_prober.go:28] interesting 
pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:17:31 crc kubenswrapper[4822]: I1010 09:17:31.337338 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:17:31 crc kubenswrapper[4822]: I1010 09:17:31.337393 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 09:17:31 crc kubenswrapper[4822]: I1010 09:17:31.338589 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:17:31 crc kubenswrapper[4822]: I1010 09:17:31.338689 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31" gracePeriod=600 Oct 10 09:17:32 crc kubenswrapper[4822]: I1010 09:17:32.100586 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31" exitCode=0 Oct 10 09:17:32 crc kubenswrapper[4822]: I1010 09:17:32.100630 
4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31"} Oct 10 09:17:32 crc kubenswrapper[4822]: I1010 09:17:32.100920 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerStarted","Data":"e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"} Oct 10 09:17:32 crc kubenswrapper[4822]: I1010 09:17:32.100940 4822 scope.go:117] "RemoveContainer" containerID="b132b1747dec78161390b84d99170cdeb3a2a302a551a7ba8f508952b5fd3e35" Oct 10 09:17:36 crc kubenswrapper[4822]: I1010 09:17:36.451493 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-nzzr4_e280e687-d626-4b27-bcef-9257b81b8b12/kube-rbac-proxy/0.log" Oct 10 09:17:36 crc kubenswrapper[4822]: I1010 09:17:36.647970 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-l5c7z_ccf84349-3882-419d-8349-90a71b1a70cc/frr-k8s-webhook-server/0.log" Oct 10 09:17:36 crc kubenswrapper[4822]: I1010 09:17:36.835060 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-nzzr4_e280e687-d626-4b27-bcef-9257b81b8b12/controller/0.log" Oct 10 09:17:36 crc kubenswrapper[4822]: I1010 09:17:36.917335 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-frr-files/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.042565 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-metrics/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.044918 4822 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-frr-files/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.044940 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-reloader/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.121497 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-reloader/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.267179 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-frr-files/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.276318 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-reloader/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.329513 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-metrics/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.333891 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-metrics/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.543294 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-frr-files/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.545463 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-reloader/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.578913 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/cp-metrics/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.583794 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/controller/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.724910 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/frr-metrics/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.789343 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/kube-rbac-proxy-frr/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.809091 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/kube-rbac-proxy/0.log" Oct 10 09:17:37 crc kubenswrapper[4822]: I1010 09:17:37.921965 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/reloader/0.log" Oct 10 09:17:38 crc kubenswrapper[4822]: I1010 09:17:38.064428 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-746cb4bdc6-l6fk7_f3981699-53fd-4702-b1b4-e5a948937551/manager/0.log" Oct 10 09:17:38 crc kubenswrapper[4822]: I1010 09:17:38.245458 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b74789bd-zcp27_5ea5d77d-f95c-469d-8dcd-f02605187e89/webhook-server/0.log" Oct 10 09:17:38 crc kubenswrapper[4822]: I1010 09:17:38.855965 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-sfhk5_94de9084-9149-456e-9e20-9415eebcd145/kube-rbac-proxy/0.log" Oct 10 09:17:39 crc kubenswrapper[4822]: I1010 09:17:39.733311 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-sfhk5_94de9084-9149-456e-9e20-9415eebcd145/speaker/0.log" Oct 10 09:17:41 crc kubenswrapper[4822]: I1010 09:17:41.508880 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xq8m5_457e5c37-7370-4959-9199-3217ee9b5b26/frr/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.131413 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/util/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.268124 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/util/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.312791 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/pull/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.380202 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/pull/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.540556 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/extract/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.541818 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/util/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.571922 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69275c2_e68f3d31-d2b5-469f-bc4e-b6b89fe95cc9/pull/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.716723 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/util/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.885543 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/util/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.928026 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/pull/0.log" Oct 10 09:17:52 crc kubenswrapper[4822]: I1010 09:17:52.934400 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/pull/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.075068 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/util/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.076939 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/pull/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.104631 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2brzvp_eaf4d641-b224-4693-b0ad-b9dd73bd0681/extract/0.log" Oct 10 
09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.249156 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/util/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.410729 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/util/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.448496 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/pull/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.448578 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/pull/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.587728 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/util/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.602185 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/pull/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.620926 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2db2t42_13771e73-9a69-4d77-91da-c3d6f058b6b3/extract/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.755589 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-utilities/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.929995 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-content/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.936669 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-utilities/0.log" Oct 10 09:17:53 crc kubenswrapper[4822]: I1010 09:17:53.971366 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-content/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.112567 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-utilities/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.174969 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/extract-content/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.376251 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-utilities/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.557787 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-utilities/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.612347 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-content/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.616671 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-content/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.869999 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-content/0.log" Oct 10 09:17:54 crc kubenswrapper[4822]: I1010 09:17:54.890944 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/extract-utilities/0.log" Oct 10 09:17:55 crc kubenswrapper[4822]: I1010 09:17:55.204067 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/util/0.log" Oct 10 09:17:55 crc kubenswrapper[4822]: I1010 09:17:55.304004 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/util/0.log" Oct 10 09:17:55 crc kubenswrapper[4822]: I1010 09:17:55.463879 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/pull/0.log" Oct 10 09:17:55 crc kubenswrapper[4822]: I1010 09:17:55.474719 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/pull/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.171165 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/util/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.206786 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/pull/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.238939 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c6rhpw_982470a6-c29e-4e2a-a83b-073df14ec4ff/extract/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.364503 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fmd4b_4dbd1ffe-6c24-4eae-861f-de345f3f855f/registry-server/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.393337 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7zj69_8a112c7a-6133-483e-b34c-f12bfcd7a4ac/marketplace-operator/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.454574 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6wz4_5bca889e-d4b2-425c-9238-bbd38169d397/registry-server/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.564091 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-utilities/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.676202 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-utilities/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.697036 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-content/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.732849 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-content/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.920406 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-utilities/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.982421 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/extract-content/0.log" Oct 10 09:17:56 crc kubenswrapper[4822]: I1010 09:17:56.991046 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-utilities/0.log" Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.160410 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-utilities/0.log" Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.211398 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-content/0.log" Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.225696 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-content/0.log" Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.279400 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6xmrd_4e6c6fef-78fc-44a1-8838-562a7eb63f8c/registry-server/0.log" 
Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.905031 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-content/0.log" Oct 10 09:17:57 crc kubenswrapper[4822]: I1010 09:17:57.913695 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/extract-utilities/0.log" Oct 10 09:17:59 crc kubenswrapper[4822]: I1010 09:17:59.274690 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6qnhn_6b8cf7ab-4f72-4127-9e04-ef062701505a/registry-server/0.log" Oct 10 09:18:10 crc kubenswrapper[4822]: I1010 09:18:10.768596 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-22bmw_1d0456c8-3612-481a-a98d-369c33a68812/prometheus-operator/0.log" Oct 10 09:18:10 crc kubenswrapper[4822]: I1010 09:18:10.916397 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b57bc545d-jdkhg_aa3f5246-4973-4251-990f-4e6089a952ad/prometheus-operator-admission-webhook/0.log" Oct 10 09:18:10 crc kubenswrapper[4822]: I1010 09:18:10.951197 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b57bc545d-xsxj4_70fba8c6-e26c-4600-857e-8728d6a7095e/prometheus-operator-admission-webhook/0.log" Oct 10 09:18:11 crc kubenswrapper[4822]: I1010 09:18:11.171513 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-xhnjq_97c2c1f4-1f4a-4f37-9435-80f0b49de473/operator/0.log" Oct 10 09:18:11 crc kubenswrapper[4822]: I1010 09:18:11.245770 4822 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-slvtm_59d48973-1a2f-48f9-b685-62961213d13e/perses-operator/0.log" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.402290 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:29 crc kubenswrapper[4822]: E1010 09:18:29.403264 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fef64b0-4014-44ba-9f23-5e43dcadd413" containerName="collect-profiles" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.403277 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fef64b0-4014-44ba-9f23-5e43dcadd413" containerName="collect-profiles" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.403506 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fef64b0-4014-44ba-9f23-5e43dcadd413" containerName="collect-profiles" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.405069 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.425564 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.536451 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9hg\" (UniqueName: \"kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.536612 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.536687 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.640520 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9hg\" (UniqueName: \"kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.640643 4822 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.640723 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.641228 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.642135 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.664922 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9hg\" (UniqueName: \"kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg\") pod \"redhat-operators-nc295\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:29 crc kubenswrapper[4822]: I1010 09:18:29.733903 4822 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:30 crc kubenswrapper[4822]: I1010 09:18:30.253604 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:30 crc kubenswrapper[4822]: I1010 09:18:30.749120 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerStarted","Data":"08e65db5f5b1ef6eb36ecf726b554f7398051d713126e16be24331c786af602d"} Oct 10 09:18:30 crc kubenswrapper[4822]: I1010 09:18:30.749454 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerStarted","Data":"51a55bb1ac3007ad464b835b84e3c732053d553375a6aebd0f0abc1b32a59618"} Oct 10 09:18:31 crc kubenswrapper[4822]: I1010 09:18:31.837076 4822 generic.go:334] "Generic (PLEG): container finished" podID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerID="08e65db5f5b1ef6eb36ecf726b554f7398051d713126e16be24331c786af602d" exitCode=0 Oct 10 09:18:31 crc kubenswrapper[4822]: I1010 09:18:31.837451 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerDied","Data":"08e65db5f5b1ef6eb36ecf726b554f7398051d713126e16be24331c786af602d"} Oct 10 09:18:31 crc kubenswrapper[4822]: I1010 09:18:31.844262 4822 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:18:32 crc kubenswrapper[4822]: I1010 09:18:32.849972 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerStarted","Data":"d85d8b14b63d280882233d5f083ad042dc1c2637060c194402926adcd7007cc7"} Oct 10 09:18:38 crc kubenswrapper[4822]: I1010 
09:18:38.917826 4822 generic.go:334] "Generic (PLEG): container finished" podID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerID="d85d8b14b63d280882233d5f083ad042dc1c2637060c194402926adcd7007cc7" exitCode=0 Oct 10 09:18:38 crc kubenswrapper[4822]: I1010 09:18:38.918020 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerDied","Data":"d85d8b14b63d280882233d5f083ad042dc1c2637060c194402926adcd7007cc7"} Oct 10 09:18:40 crc kubenswrapper[4822]: I1010 09:18:40.940504 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerStarted","Data":"198a5309edb02d0e5d5bde844f3be8e5bf8ae71c1c01586601a0e424b8616e4f"} Oct 10 09:18:40 crc kubenswrapper[4822]: I1010 09:18:40.970548 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nc295" podStartSLOduration=4.109578762 podStartE2EDuration="11.970528932s" podCreationTimestamp="2025-10-10 09:18:29 +0000 UTC" firstStartedPulling="2025-10-10 09:18:31.844036429 +0000 UTC m=+10458.939194625" lastFinishedPulling="2025-10-10 09:18:39.704986599 +0000 UTC m=+10466.800144795" observedRunningTime="2025-10-10 09:18:40.961887633 +0000 UTC m=+10468.057045859" watchObservedRunningTime="2025-10-10 09:18:40.970528932 +0000 UTC m=+10468.065687118" Oct 10 09:18:49 crc kubenswrapper[4822]: I1010 09:18:49.736526 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:49 crc kubenswrapper[4822]: I1010 09:18:49.737754 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:49 crc kubenswrapper[4822]: I1010 09:18:49.837504 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:50 crc kubenswrapper[4822]: I1010 09:18:50.072196 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:50 crc kubenswrapper[4822]: I1010 09:18:50.121320 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:52 crc kubenswrapper[4822]: I1010 09:18:52.043764 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc295" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="registry-server" containerID="cri-o://198a5309edb02d0e5d5bde844f3be8e5bf8ae71c1c01586601a0e424b8616e4f" gracePeriod=2 Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.057721 4822 generic.go:334] "Generic (PLEG): container finished" podID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerID="198a5309edb02d0e5d5bde844f3be8e5bf8ae71c1c01586601a0e424b8616e4f" exitCode=0 Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.058070 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerDied","Data":"198a5309edb02d0e5d5bde844f3be8e5bf8ae71c1c01586601a0e424b8616e4f"} Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.447189 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.586241 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities\") pod \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.586422 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content\") pod \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.586548 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f9hg\" (UniqueName: \"kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg\") pod \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\" (UID: \"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a\") " Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.593792 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg" (OuterVolumeSpecName: "kube-api-access-5f9hg") pod "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" (UID: "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a"). InnerVolumeSpecName "kube-api-access-5f9hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.599332 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities" (OuterVolumeSpecName: "utilities") pod "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" (UID: "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.690075 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" (UID: "b71e39ba-307a-42d3-9d31-1a9ba95fbd4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.690904 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.690943 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:53 crc kubenswrapper[4822]: I1010 09:18:53.691004 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f9hg\" (UniqueName: \"kubernetes.io/projected/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a-kube-api-access-5f9hg\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.069330 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc295" event={"ID":"b71e39ba-307a-42d3-9d31-1a9ba95fbd4a","Type":"ContainerDied","Data":"51a55bb1ac3007ad464b835b84e3c732053d553375a6aebd0f0abc1b32a59618"} Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.069394 4822 scope.go:117] "RemoveContainer" containerID="198a5309edb02d0e5d5bde844f3be8e5bf8ae71c1c01586601a0e424b8616e4f" Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.069551 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc295" Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.101590 4822 scope.go:117] "RemoveContainer" containerID="d85d8b14b63d280882233d5f083ad042dc1c2637060c194402926adcd7007cc7" Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.123569 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.126845 4822 scope.go:117] "RemoveContainer" containerID="08e65db5f5b1ef6eb36ecf726b554f7398051d713126e16be24331c786af602d" Oct 10 09:18:54 crc kubenswrapper[4822]: I1010 09:18:54.136301 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc295"] Oct 10 09:18:55 crc kubenswrapper[4822]: I1010 09:18:55.669420 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" path="/var/lib/kubelet/pods/b71e39ba-307a-42d3-9d31-1a9ba95fbd4a/volumes" Oct 10 09:19:31 crc kubenswrapper[4822]: I1010 09:19:31.337717 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:19:31 crc kubenswrapper[4822]: I1010 09:19:31.338510 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:19:41 crc kubenswrapper[4822]: I1010 09:19:41.055255 4822 scope.go:117] "RemoveContainer" containerID="3f56ff33be5c85001d3cf546d186c8d068bba6cc16669c37d6172e6d1c185c7c" Oct 10 09:20:01 crc kubenswrapper[4822]: I1010 
09:20:01.336581 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:20:01 crc kubenswrapper[4822]: I1010 09:20:01.337161 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:20:24 crc kubenswrapper[4822]: I1010 09:20:24.178232 4822 generic.go:334] "Generic (PLEG): container finished" podID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerID="170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea" exitCode=0 Oct 10 09:20:24 crc kubenswrapper[4822]: I1010 09:20:24.178312 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" event={"ID":"0673d647-f8a4-43da-b9ee-019099d7fa5d","Type":"ContainerDied","Data":"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea"} Oct 10 09:20:24 crc kubenswrapper[4822]: I1010 09:20:24.179706 4822 scope.go:117] "RemoveContainer" containerID="170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea" Oct 10 09:20:24 crc kubenswrapper[4822]: I1010 09:20:24.429598 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lj4jx_must-gather-v8fr6_0673d647-f8a4-43da-b9ee-019099d7fa5d/gather/0.log" Oct 10 09:20:31 crc kubenswrapper[4822]: I1010 09:20:31.336919 4822 patch_prober.go:28] interesting pod/machine-config-daemon-w2fl5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 10 09:20:31 crc kubenswrapper[4822]: I1010 09:20:31.337698 4822 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:20:31 crc kubenswrapper[4822]: I1010 09:20:31.337869 4822 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" Oct 10 09:20:31 crc kubenswrapper[4822]: I1010 09:20:31.338892 4822 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"} pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:20:31 crc kubenswrapper[4822]: I1010 09:20:31.338980 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerName="machine-config-daemon" containerID="cri-o://e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" gracePeriod=600 Oct 10 09:20:31 crc kubenswrapper[4822]: E1010 09:20:31.465287 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:20:32 crc kubenswrapper[4822]: 
I1010 09:20:32.267271 4822 generic.go:334] "Generic (PLEG): container finished" podID="86167202-f72a-4271-bdbe-32ba0bf71fff" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" exitCode=0 Oct 10 09:20:32 crc kubenswrapper[4822]: I1010 09:20:32.267356 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" event={"ID":"86167202-f72a-4271-bdbe-32ba0bf71fff","Type":"ContainerDied","Data":"e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"} Oct 10 09:20:32 crc kubenswrapper[4822]: I1010 09:20:32.267417 4822 scope.go:117] "RemoveContainer" containerID="aac138ea0c0a0c7046a707fa2212806ddc6545488f8b72f58a6bc2b4d54d5d31" Oct 10 09:20:32 crc kubenswrapper[4822]: I1010 09:20:32.268276 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:20:32 crc kubenswrapper[4822]: E1010 09:20:32.268849 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:20:33 crc kubenswrapper[4822]: I1010 09:20:33.440838 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lj4jx/must-gather-v8fr6"] Oct 10 09:20:33 crc kubenswrapper[4822]: I1010 09:20:33.441400 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="copy" containerID="cri-o://b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb" gracePeriod=2 Oct 10 09:20:33 crc kubenswrapper[4822]: I1010 09:20:33.451424 4822 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lj4jx/must-gather-v8fr6"] Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.007920 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lj4jx_must-gather-v8fr6_0673d647-f8a4-43da-b9ee-019099d7fa5d/copy/0.log" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.008470 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.047681 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output\") pod \"0673d647-f8a4-43da-b9ee-019099d7fa5d\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.047744 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c79k9\" (UniqueName: \"kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9\") pod \"0673d647-f8a4-43da-b9ee-019099d7fa5d\" (UID: \"0673d647-f8a4-43da-b9ee-019099d7fa5d\") " Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.053191 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9" (OuterVolumeSpecName: "kube-api-access-c79k9") pod "0673d647-f8a4-43da-b9ee-019099d7fa5d" (UID: "0673d647-f8a4-43da-b9ee-019099d7fa5d"). InnerVolumeSpecName "kube-api-access-c79k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.150505 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c79k9\" (UniqueName: \"kubernetes.io/projected/0673d647-f8a4-43da-b9ee-019099d7fa5d-kube-api-access-c79k9\") on node \"crc\" DevicePath \"\"" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.239060 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0673d647-f8a4-43da-b9ee-019099d7fa5d" (UID: "0673d647-f8a4-43da-b9ee-019099d7fa5d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.252523 4822 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0673d647-f8a4-43da-b9ee-019099d7fa5d-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.292545 4822 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lj4jx_must-gather-v8fr6_0673d647-f8a4-43da-b9ee-019099d7fa5d/copy/0.log" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.292951 4822 generic.go:334] "Generic (PLEG): container finished" podID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerID="b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb" exitCode=143 Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.293008 4822 scope.go:117] "RemoveContainer" containerID="b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.293218 4822 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj4jx/must-gather-v8fr6" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.320634 4822 scope.go:117] "RemoveContainer" containerID="170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.404405 4822 scope.go:117] "RemoveContainer" containerID="b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb" Oct 10 09:20:34 crc kubenswrapper[4822]: E1010 09:20:34.404842 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb\": container with ID starting with b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb not found: ID does not exist" containerID="b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.404879 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb"} err="failed to get container status \"b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb\": rpc error: code = NotFound desc = could not find container \"b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb\": container with ID starting with b9fcde57f1528c1e71ac63b56467d7b946e9a7a7c7df2447299cb95873f976eb not found: ID does not exist" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.404901 4822 scope.go:117] "RemoveContainer" containerID="170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea" Oct 10 09:20:34 crc kubenswrapper[4822]: E1010 09:20:34.405164 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea\": container with ID starting with 
170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea not found: ID does not exist" containerID="170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea" Oct 10 09:20:34 crc kubenswrapper[4822]: I1010 09:20:34.405244 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea"} err="failed to get container status \"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea\": rpc error: code = NotFound desc = could not find container \"170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea\": container with ID starting with 170249a38ed5f22546553d57e8884edb100a3760d89a9acfba55f298174a23ea not found: ID does not exist" Oct 10 09:20:35 crc kubenswrapper[4822]: I1010 09:20:35.665270 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" path="/var/lib/kubelet/pods/0673d647-f8a4-43da-b9ee-019099d7fa5d/volumes" Oct 10 09:20:45 crc kubenswrapper[4822]: I1010 09:20:45.650532 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:20:45 crc kubenswrapper[4822]: E1010 09:20:45.651578 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:20:59 crc kubenswrapper[4822]: I1010 09:20:59.653081 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:20:59 crc kubenswrapper[4822]: E1010 09:20:59.653873 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:21:14 crc kubenswrapper[4822]: I1010 09:21:14.651744 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:21:14 crc kubenswrapper[4822]: E1010 09:21:14.653190 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:21:25 crc kubenswrapper[4822]: I1010 09:21:25.650776 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:21:25 crc kubenswrapper[4822]: E1010 09:21:25.653492 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:21:39 crc kubenswrapper[4822]: I1010 09:21:39.650850 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:21:39 crc kubenswrapper[4822]: E1010 09:21:39.651705 4822 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:21:50 crc kubenswrapper[4822]: I1010 09:21:50.651510 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:21:50 crc kubenswrapper[4822]: E1010 09:21:50.652624 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:22:05 crc kubenswrapper[4822]: I1010 09:22:05.654417 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4" Oct 10 09:22:05 crc kubenswrapper[4822]: E1010 09:22:05.655674 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.510842 4822 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"] Oct 10 09:22:11 crc kubenswrapper[4822]: E1010 09:22:11.512203 4822 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="gather" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512226 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="gather" Oct 10 09:22:11 crc kubenswrapper[4822]: E1010 09:22:11.512273 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="extract-content" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512286 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="extract-content" Oct 10 09:22:11 crc kubenswrapper[4822]: E1010 09:22:11.512348 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="registry-server" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512362 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="registry-server" Oct 10 09:22:11 crc kubenswrapper[4822]: E1010 09:22:11.512391 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="copy" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512405 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="copy" Oct 10 09:22:11 crc kubenswrapper[4822]: E1010 09:22:11.512433 4822 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="extract-utilities" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512447 4822 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="extract-utilities" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512862 4822 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="gather" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512901 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71e39ba-307a-42d3-9d31-1a9ba95fbd4a" containerName="registry-server" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.512924 4822 memory_manager.go:354] "RemoveStaleState removing state" podUID="0673d647-f8a4-43da-b9ee-019099d7fa5d" containerName="copy" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.515979 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwjlv" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.523653 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"] Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.670741 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.670796 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq95h\" (UniqueName: \"kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv" Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.670927 4822 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities\") pod \"certified-operators-vwjlv\" (UID: 
\"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.773221 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.773317 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq95h\" (UniqueName: \"kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.773395 4822 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.774007 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.774083 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.794197 4822 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq95h\" (UniqueName: \"kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h\") pod \"certified-operators-vwjlv\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") " pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:11 crc kubenswrapper[4822]: I1010 09:22:11.840236 4822 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:12 crc kubenswrapper[4822]: I1010 09:22:12.390113 4822 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"]
Oct 10 09:22:12 crc kubenswrapper[4822]: I1010 09:22:12.510883 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerStarted","Data":"6682622400cfe78449c42b3c306c8c716964b385b222cf96dbb1076bfa9b6357"}
Oct 10 09:22:13 crc kubenswrapper[4822]: I1010 09:22:13.527652 4822 generic.go:334] "Generic (PLEG): container finished" podID="dbabde7e-381a-4c42-99ee-06b6d85e1a35" containerID="0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840" exitCode=0
Oct 10 09:22:13 crc kubenswrapper[4822]: I1010 09:22:13.527721 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerDied","Data":"0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840"}
Oct 10 09:22:14 crc kubenswrapper[4822]: I1010 09:22:14.541072 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerStarted","Data":"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"}
Oct 10 09:22:15 crc kubenswrapper[4822]: I1010 09:22:15.555459 4822 generic.go:334] "Generic (PLEG): container finished" podID="dbabde7e-381a-4c42-99ee-06b6d85e1a35" containerID="0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391" exitCode=0
Oct 10 09:22:15 crc kubenswrapper[4822]: I1010 09:22:15.555875 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerDied","Data":"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"}
Oct 10 09:22:16 crc kubenswrapper[4822]: I1010 09:22:16.571552 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerStarted","Data":"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"}
Oct 10 09:22:16 crc kubenswrapper[4822]: I1010 09:22:16.607672 4822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwjlv" podStartSLOduration=3.145665155 podStartE2EDuration="5.607650602s" podCreationTimestamp="2025-10-10 09:22:11 +0000 UTC" firstStartedPulling="2025-10-10 09:22:13.532471047 +0000 UTC m=+10680.627629243" lastFinishedPulling="2025-10-10 09:22:15.994456484 +0000 UTC m=+10683.089614690" observedRunningTime="2025-10-10 09:22:16.585672129 +0000 UTC m=+10683.680830355" watchObservedRunningTime="2025-10-10 09:22:16.607650602 +0000 UTC m=+10683.702808808"
Oct 10 09:22:20 crc kubenswrapper[4822]: I1010 09:22:20.650962 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"
Oct 10 09:22:20 crc kubenswrapper[4822]: E1010 09:22:20.652767 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:22:21 crc kubenswrapper[4822]: I1010 09:22:21.841249 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:21 crc kubenswrapper[4822]: I1010 09:22:21.841401 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:22 crc kubenswrapper[4822]: I1010 09:22:22.129411 4822 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:22 crc kubenswrapper[4822]: I1010 09:22:22.728531 4822 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:22 crc kubenswrapper[4822]: I1010 09:22:22.793669 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"]
Oct 10 09:22:24 crc kubenswrapper[4822]: I1010 09:22:24.661417 4822 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vwjlv" podUID="dbabde7e-381a-4c42-99ee-06b6d85e1a35" containerName="registry-server" containerID="cri-o://2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654" gracePeriod=2
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.134852 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.299514 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities\") pod \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") "
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.299642 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq95h\" (UniqueName: \"kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h\") pod \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") "
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.299728 4822 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content\") pod \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\" (UID: \"dbabde7e-381a-4c42-99ee-06b6d85e1a35\") "
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.301843 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities" (OuterVolumeSpecName: "utilities") pod "dbabde7e-381a-4c42-99ee-06b6d85e1a35" (UID: "dbabde7e-381a-4c42-99ee-06b6d85e1a35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.306283 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h" (OuterVolumeSpecName: "kube-api-access-vq95h") pod "dbabde7e-381a-4c42-99ee-06b6d85e1a35" (UID: "dbabde7e-381a-4c42-99ee-06b6d85e1a35"). InnerVolumeSpecName "kube-api-access-vq95h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.402071 4822 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.402104 4822 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq95h\" (UniqueName: \"kubernetes.io/projected/dbabde7e-381a-4c42-99ee-06b6d85e1a35-kube-api-access-vq95h\") on node \"crc\" DevicePath \"\""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.467235 4822 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbabde7e-381a-4c42-99ee-06b6d85e1a35" (UID: "dbabde7e-381a-4c42-99ee-06b6d85e1a35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.504452 4822 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbabde7e-381a-4c42-99ee-06b6d85e1a35-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.692899 4822 generic.go:334] "Generic (PLEG): container finished" podID="dbabde7e-381a-4c42-99ee-06b6d85e1a35" containerID="2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654" exitCode=0
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.692956 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerDied","Data":"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"}
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.692972 4822 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwjlv"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.692986 4822 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwjlv" event={"ID":"dbabde7e-381a-4c42-99ee-06b6d85e1a35","Type":"ContainerDied","Data":"6682622400cfe78449c42b3c306c8c716964b385b222cf96dbb1076bfa9b6357"}
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.693008 4822 scope.go:117] "RemoveContainer" containerID="2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.724399 4822 scope.go:117] "RemoveContainer" containerID="0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.729281 4822 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"]
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.739923 4822 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vwjlv"]
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.746998 4822 scope.go:117] "RemoveContainer" containerID="0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.792570 4822 scope.go:117] "RemoveContainer" containerID="2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"
Oct 10 09:22:25 crc kubenswrapper[4822]: E1010 09:22:25.793129 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654\": container with ID starting with 2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654 not found: ID does not exist" containerID="2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.793168 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654"} err="failed to get container status \"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654\": rpc error: code = NotFound desc = could not find container \"2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654\": container with ID starting with 2cb54605ae086a252b2caa93363053c998cccf1293563d665c3ce6694e548654 not found: ID does not exist"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.793194 4822 scope.go:117] "RemoveContainer" containerID="0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"
Oct 10 09:22:25 crc kubenswrapper[4822]: E1010 09:22:25.793491 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391\": container with ID starting with 0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391 not found: ID does not exist" containerID="0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.793521 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391"} err="failed to get container status \"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391\": rpc error: code = NotFound desc = could not find container \"0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391\": container with ID starting with 0af8660d5aa847d9220cd0cccd28834c5d75e22da83705dfe2b855e82b1b0391 not found: ID does not exist"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.793539 4822 scope.go:117] "RemoveContainer" containerID="0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840"
Oct 10 09:22:25 crc kubenswrapper[4822]: E1010 09:22:25.793778 4822 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840\": container with ID starting with 0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840 not found: ID does not exist" containerID="0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840"
Oct 10 09:22:25 crc kubenswrapper[4822]: I1010 09:22:25.793818 4822 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840"} err="failed to get container status \"0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840\": rpc error: code = NotFound desc = could not find container \"0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840\": container with ID starting with 0802e8b009f27dc0efc04b5c41b71e8a481a482b3601dec040a6989c0b378840 not found: ID does not exist"
Oct 10 09:22:27 crc kubenswrapper[4822]: I1010 09:22:27.667032 4822 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbabde7e-381a-4c42-99ee-06b6d85e1a35" path="/var/lib/kubelet/pods/dbabde7e-381a-4c42-99ee-06b6d85e1a35/volumes"
Oct 10 09:22:33 crc kubenswrapper[4822]: I1010 09:22:33.679887 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"
Oct 10 09:22:33 crc kubenswrapper[4822]: E1010 09:22:33.682695 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"
Oct 10 09:22:47 crc kubenswrapper[4822]: I1010 09:22:47.654439 4822 scope.go:117] "RemoveContainer" containerID="e2c1257cc2ac4e26777fdff9b4ca90980010baddce993597a5ca8d6e4e2d69a4"
Oct 10 09:22:47 crc kubenswrapper[4822]: E1010 09:22:47.655562 4822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w2fl5_openshift-machine-config-operator(86167202-f72a-4271-bdbe-32ba0bf71fff)\"" pod="openshift-machine-config-operator/machine-config-daemon-w2fl5" podUID="86167202-f72a-4271-bdbe-32ba0bf71fff"